[Binary tar archive — payload not human-readable. Recoverable metadata from the ustar headers:

    var/home/core/zuul-output/                    directory, mode 0755, owner core:core
    var/home/core/zuul-output/logs/               directory, mode 0755, owner core:core
    var/home/core/zuul-output/logs/kubelet.log.gz file, mode 0644, owner core:core (gzip-compressed kubelet log)

The remainder of the file is the gzip-compressed byte stream of kubelet.log; it is binary data and cannot be reconstructed as text.]
F!c !QVj+E0tp 9ѯPVjtJiN0lN"\CC+@h Qj ҕR[]`#QWWP ;ҲN .9 µ"'U%kS+km@"pRXC !?K(Yd PTCPÏ}҂݌w"g` sǴ.VsfixNc&& \B!Nzn?.wݨS4mgyϯ _E[o}7nrLXQZVX[jک//+_]tV;^-u)4PL.̇WEkEgTi%8nПGv(\wh88 Vh_!H޾B1CtPCc_oUΣW`o^fJv[lY!vjlpf..[-|ы7g]$/lعJP -c!|Q?!< :ڝ݌Pv#blK+[(blKY.&׶blωcbLJY8mXwPEpgV%U噅:T:-z`_1xP.,:\T, K˜6v+šjm꿑w@U;'V<,qߺaur<^Y>^*ڑ-7갅s $Js*,hnӸp.&? l2ztQЕ\ôrZmk䮬A\GeJq(?)MH} 7ǘ4Jӕ#R+m{7 aӂ9贰"ګQBk PBQmz^X15J!0dpT(:% |U@Z6|4=p%DӀ <'^tp -M+D)Z:E .+,n8theO_"J)ҕ޽8lXуzT1Y iCbt~wt]!`+ke(tho;B'IW˗gEggɠSgݬ-E ߽x4L>OCG=X{r—fsA C<'ছR-=1OLX"M,i- Dވwu.ݎ>)E˭EUjU:!ŏIkaf3_4^t$Qt=p0ݞ.$cw5+ٿ޿,9N03}Y|sɊGe"U13"!i&)Ir${|Wҿaiu10&Z\3?^}1c)7AI]Rǻ'l pn|(jB׼Yߩ!v}PƯqIj٪7@_no_KrNܿ)'ife4זsN^E [qNb?GW pRRƹLH.eUaGSj1.giTsT#~Ͽ&#SgY?_Ԧ_&f ,ⵣzリ|11A= \DK.X[d|5/bm6V<eƥhLB8%A{_-OR#s)u!l3dx9 b>ڼ0f ~=OA\0B]VB1 BkLKQr%.z`_|z|Bƀa|˾]YЉPx<EQ G/tD7gqwT. $S'a0.Wx 렃ɩfonaN5zVPxOC2&&f-ӥkwi8E,M*24kPA\' 61&8&SH9e GRAU|4Dfi˔;Te8˜RgJhGY*ȃK1:"Xb= y`K1ϜpQ)ՙgIjҚUPmpVAKsSTM-.Sm-pauKT0!^qQ Ku~$Iت##vHI$V,+VƝ_}錯^1wM2ubK =>mhٶVYQ]k[c^22wkL0\E.;j.]z7|{KgH&,=Q/W:gŻ {SR~7z{/rdbc(gg+4aZ%Qxw=U%SV*~]_$#kI`^4QRitANmJ[Iu$Iej4M:i訕4+\YUZժUfkEתd*(-M(;PZ蔋ƷKk[>odž( 8,%L_ʬ[qw`]F3JXb Y.!rBHH!JL;0D;{T+eB&tLaLkΤ3%2S*gy"h>Tw[$vz5c@BYyr}u_?܈}]%{p.#fʵ,# |ܓL&iyJg ∃['?M}˗lb-<]poVֿ9(`.:vtl'lX/(S֒xǥ2Df"f"04Q3#,Jrk)ٮ%׳lqO`y9s=9։¶:j}v`‰p %n54=nmSۡ9׺3NBap({iu*2Evp;S?wjd ځ<\ehkFQ0뛮O ,;KgJ⤊onxp{]u42SO?eTs9GMZ{ E%R>ֺ`xYn>|`iJ,@JIr龜wmI_!eF꧀|:KdX'n 6)9~Ç(#QP1` k*Qls ʡA(E4Jhc g}pG &ռ!I0"JƄ!?q> 2 /0 >BΆGc[3ϲ#@h812r+M$dq t"++XZPjQGncT,L&DBÒ3ѢV8sOrܖ4_=Tp*NO zz漏<8c+jkb_<{?[ps("Vz1ϠzXHB7$Κ(8*!AL8:fbb'c^N5`xɦQՒ^:)gcq!#e`E8G_LH^#cZ;-.b=_~W??oNO/f`k$nM?"-1CC -ͷOqoK~\ڸ~r~Ob)uY”͟KEi6_!IfUf^?i6 ks!fo7Ƶ,*xh .qoV}[}}Ỏ F9$+\Q?(p `7'ic, lBK B(Ɖc?yhtWr~N ƠpAuۣ'pK7c͑`D{ps;:ЄxLh<|ei UF$fEYʃPVeR%c.+̲ۄje[o$_![QO>K,5˷SI#\蜝hNIjr@^!KIF_i^Nd5E=˚'fk ;Iˢ\ѿkxzU9smi)"Ugm\$soiV ݱ4f৶m @)].J} ZwahkbZQvc6|W3ne崞Fu<^9_787 v3[Nl>7x$%pOr%<HZp萴Pi\I!$uJ鿘z{?Lae9~XZG4Q0[MsZ? 
9NJ)*IC8Rx@.|ہNzmdr40_uAcf׸Cl -, '81Xv{/Eo9 5N~==ּrOɓV>/nAz)Xcq|+mF8i(Pc*ϒ q~1 o?>W5^zPdu{8h4/ېO`qIc}ZMR%dzUVn5z%|8_3oSkL^QUc5z!Nާ@ټ|oy+9p|v׈6Jsfݍt'̞_=:AAgM'pRotbZqA삫KydGdE ~Mӻ;LoE)_*m( nx_vA{sP~>B ֠b?AQంAI%Xz"Rk?>Z(;YI@*˴$z_i΄SkڦoJUH{@[u8Lp+OK'wڥ,2̂+';bVI)zL2X!bt7b03I0ޒ1 V^xAd/s҉!"&2>ԍ.8&U;٬ӰJyf?gqx">8At9%d[B%7$Q7mEK"C ߂웿M}_-wAKE^;s۱qc=oJ3$b2³ >+ߔޥ] (B.s%#sUPUY(m,ݻdp#c=Zp{ܢ r]]V-F9TL^ Lf;\ Is&L!Y d#Po+9uLJoW2,RJuR1&$OzWM\>MK?[߇ÆʅuM94+Ao]]V8"Cߣ,׫+׷n׎/YzU@xi[vX fWvڂzM㸍NRt8n@DAV W)7#)H`x<& &:ˬTQ'gc!+6 7]nL;4WT}Vr=Z9FAV. {@R}4Nb\w{knZuoE$BLD%͕+A-j+a_..d_J->ePʆbhie#٠uʸɛ2:99OB $}:m7q?NIٶlY΄1$A1CfgC ' ( hV̳uwȭ4ApENX411\l: 0e9k yvoaT|0=8!B$аLS%oNsO*tEx6EӷsE3{#N$AzQ2P14@f\3A]41k;hO<`Es0'<"'rFgthf+%p@œ;cwhbHSrqvmBn?r$h#PKOsV2踉OȐ`:(][z\3yg'!ޛn(<+=IW"@9꒥ s>d״v"YD}޵q#e/m4`R'=䂺~i`i,=⸇oȕdڊ,ΐ 3\ܔv4mjiZ| B2 KF?!GYg=3?Ս{-))̌WZ _ňȹf8`I8f0pjdV?ʼSܟ{w="(b;d bVYB$*j 5`FMz`dyy=R+an@ m{>+,(p\;T:t,X*HY.Mv!K ${4k^D쥜J8AALS@r3#X+.` IL'P!>es}?U0#uVwNҍף,8[ V]]N-U P+ߏ ]NC)j)Ţ(uȐz2Ւ,0/{FJ\~+;7AniC#aNQUDu9 s>?66= YzdzXQ(T;i4jj]V3۾o=~s6>}}ǻN^==Dt|vzV@TeU˂\_WbG￁j4Л-A&{=]S69m;]@S~u޴6flG]szwڎ'en ӭ[%Lo2;nN5i||Jv]2!0Ӏ@]\8\~+25c<#ͣƺ"{!d Qp8Gw mYXAOjg!g"Ĉ2V(CmO?CvqҚcjF3E(ǂc$x)#Rkީttdb gBx3?a;-dž:j \S%3&8㌁gTPlB=ʍr^)[Q;N)pj;VH-9kDܶi ecU'4S`tH(rӒ"kasJ %z~(j3y=6HE $a5^0F)W 騳Y8 f|<[p-wTueYF窬TSH!m)T0Iv 9&lV6|'*4\i)sύ:3ɽ}Uvfj=ɷz ܳ!3E"A {^uƃ=4ZJU(lJ酥 I׶UevGVw ҳaCebct,ȳ4PA=4|twm=8іБC,֏W8|\|?g_4YUDJ3Ln!9 hH$6Ħ!$3CqJ* 2 bh8rr;njo!]+ x$҃02CXņV(̖PEp}}$(WZ\u*QD?a=yFh FCy;9KFF5AЃws[ǟ/۟G ìc;M1C(m8 *\RAs 7~Nwa<"SvНHub___'N޽,,C*}B"sEڜzհf~zDnNOu.u򈲪|W_-{Pxs['ޖ9Иj֖:PkO8ś[j9xco^\N߼ѼOeZN68nOYkwDqG{0 tuT^/2A15ktm|gdg/}J Q֣dvE+d$/w& XQmd{g mPᤥt+y*"A-jhLE710lYsn2A[\ۛ/&P3XXgiK,RX:*"g\2KufKӚE˦D֘MkZii.YivXA22jU|uFP`Ec1KfgJsDQP xs9ìw,I}o0pc0gQEA|{[ESvǙb.|<"Q P\0-=ܴC0ݖ||o/(k#+XΓE* ?] 
+ܔ]wi۳qb8|9c\|J痞ve:+uJppo+ XJ ٣_ݫ[9HqČ-S@r3#X+.` I S6קYx}pݯ՝thx1p< 0CsWm>dKխlgE0al{|˩p;&T#P?]8|k׭o 46&#hɰ{r#G+WيWЃ];%~X/d^ Nc"pr+I f 忞)1|0_^x 8/X,:M Q5&CrF\|{l]hQgKV8#CQ)*j$ߺ3(0 <]mvo]pߨA7>w/!<Wk0'ʲAwW:W,{e | /peXKc4[(M HetȃKRm RQR(6H5W/>QٹD̙47WԾvIa1h)r ֺ 9E3l hac<%QA h 9R46]\1v=8E.^f%]?z0/?~5vܠ|?{X/Pj\[c "qV E*lagicg'3+Տc .+\퉇x6ޢ:~tޥ TBIFa%Halnu}.R /?[LL5U/_*{jIUXe^=ycv ^A+јF)ёhscW]AJ)p vI f".^%7U~.2dH{%eVVjO~*u t5T;N]R|]y $c)82(ͥvi)sZE`N+vHN,Inf".Hw[جe9#R9t@R).Y!OJe(EIEʆhQb'UrfP4Tqt#42D)59SUR aEFӖ嬵4KAǥ=aT}106BT^zD^K'hcְ*JRcD"Ц< Wu}opj.Fpj%Q 2 qcx 8XU6ڃZn6vRr/5Q4!Ђ Fs̉9n8L 9m4G+Ee7p5j"GB8&DP5[N0ku:K[{[a"|ȥ _`D 0BL7n O`2Ј@l;']ghO<`l޿R-ytNЧiR m-ge x()GnN $R8" !)D1āqXmWgn{\ZbrM3:"}nZ'sbd}\㚖J莖*7mFZoкJz"tI@c7ڍmSqAQ7(t(K?TFel>ƽ೅6SkɷwY_̸Hl~r!jE66I.ln#l{vq#Cϣws|b(ԕ0Vy{aB V(PA;BEmz? U›3/>??Ƶuh2uX~ы~k[ֹ k$#|af*ʸ5Ďwܨxu/OFkVvhUXrAaA:|m-:F c-UѺ;yo*(۴mG]!F#@vT(4Tc4,s-C:w_泚L.)roj@ |DWBC#jx_xՋ1Φ;ޓ΋zLxdVȽgC EχBLDbՕ5TM 4"W\ 6{Vyc&$ ߤqJTgHH8 "V_ ˕Bnij{k4νr ޙk1NN,DҮv,OzF:}ek緼jm]}+t:A+|!A7}/h+xFf-t7d$3mZ ֶ%  ,ХΪ}Yq˒òZ H:磴{ƭ9k[rܛBܽT{M=??#zMHRGHڂ- m|,Dc2w쾩ݫow]C1T)b+t@Hm I9DŽRn&x92J#"( -NZ H^ 8` bfKYErh9Q*"ÝEk_bEk];ckka-7Fgr}qQKK.5Ҥk1ʉkDůi ~ o,7l,wc,K7Kq$"_Lq#4,3Y`T9F$㑅&R/5eDDL ` 6H2&"݆QF0ZD^/ DnxCPNlρfK6ӘG"h@QAsglr1R)iŷ1WU-PoiD;vґׄ8(c֌-FW#;6]QsL} ( AHRVa S띱Vc&ye4zl" Qj4BZ"Z3tWn< goQ[OT 0*+MZSo0 L#E[KH=]gZժwEMsKF|C?̍0 n4I sbʸ)?'];x>L?>E`>\+y ׂL#:?4 MWqU.l֣fQsf73ȕZua2?JpCeAÅv ɘK`%գ^΍As"/i^2Tsq^=?ho>oexZͣwfOt z?Ko=>̓z)Ϯ||n}=MSOFϠ'2jv#*.;vk!I _f_!aF_)*&äXg5ON@PӍ ӅQ^vr^IT4kYh6w7lKIy4ոWǮ0$5AZX,_N99N]`XZroDʵ;Uog5UԧCy{vB})n@╏KGm7 ,3A [[QYJʡ켾);t77j=V4Wx/P!H6^e.DjcX3> IXBg `*(S:$ rt-!\0Gdi%}Œe, ɘV!!: 0$H[XA`Ir,a#ֱ6dl2KZ,i]JUKXBbU<,k v`sD^s׎Sw#d~ƽ駻޴J3=k.ǒʕ_~D%Ңc78b$ApU|@~4\0Ӡl>_񻏥d7}swwB9 ycL%z˼,jƍ"mzԔU`h4+y$4D ra+>ՑXξd~!s5 *3o0;Ѽ*i>O?jδ͆~\ 9V&A|w](*Cz]UfS2ݚ ȺÞVy 8}jP>Z)ZfgjW/QS?(ry9rY,[$3 f e%t\ݟYPX(^CY&6xyU1`ګ TIJ ͯO%mdzhsQud˻%rk.ӽk5mWտ* (:=A3W~~-wnK 0+ۮ3;RU)mQriulPi9< o5b 3٤ۙ|kr1īy]TSimފ*b>+֞فr3(jOޖ'-FN+'D]wvsE\Þ)gt38狞)_q^j+r|I>~bq"VB8% /znsi]\5{>,0>5*ЗүcwCylȭxS+Xe##YqDg-3:HgL"oV a &LI 
Vi,SYJy>%(ykY˅OZNh%_Lٓzd4Kŀ5w"]BZ&ɺcYVF4O|C.+{ i4Cy½a`$%bFy>!ay@k 7x&kɲ Xn$\ꡜFǤyOufEwZI%HD0 AP$@2qELX20cy~# +>6a@ a84UL`Q茍ƀ#^YW1V6Y?Tgt*D,WXXe)ٻ6,WL~1Y; de /YS$EJlIQ.ev;U\_FP&8FHt(R^Ez_f.avv'Ԉ?.Ͽkٛ7[-Dͥ#!L{uu8%}g{`ָ馤 kŸ_zn2R=ե+xozXuӣq&$^`gX36IL@}!LݙL•|=L^fa YipBs` 맫pSu 黋YE$Ea8~G?ՑlwmjC2P_<^]ͮB3RRG_jB ĉq֊ԕr|q n}ofw6#ޮfz928[3*-̗0.R}Kۥ"O7ɐ?4qa!$"Z']=?, ;PDF2ˣq]d4|zrUxl{j*0ًNBHz0h\#~poԜ>ܳcBhASߚr/A]?^_Ϗxû Lśx5: Lu j)0c_o~zuG揦hBsw{_UڱW 2{?|y='.p}3/ Nai?d6_i]5U޼H5$Ɉjs+_q@m_/^16Hhagt3绘-jc$O``, *0wҎ2Nj\:#o۩+gO]8 K|p:Xr/E 08л#sKXժS|Ed 2I68-:yjbR$g1S:Y9r,m]1M&V6FЎ"NwG߭(jQu9kSޜڑc[-N[tZKq8SJݭyф3$/ {{KpqЩi[55^?_h©-^ 5w7Hч~ W׽&9|?_O0@MW?~l5v`}f)xY`h] ه6&]"ց9/^6S/2" -oXALm : ݺ7~on _ \{!Y'G;ltB>XWU !1_c6GHc8S=}^06 (΄σrw96ҏmi]i[q)-<@)ѝA4wwC䌂ͮbRb*[(jSݦMbX<3Q iP XD(wG( yf!.FygyGrѧ՞=26cV<vaom4POQދt"I,X1 ύ5QhHWHC9RQ,;CoAw|6uBK[Wb6NOg -͑07i·K<=V:㬯t &&VQ) 3@cLWj NVdGsUr{5 ³4?tlf^\*;3-cV8PʸKԙZ>7j4>~_gxh EL@Y*:h5R`҈ITڳOZMۉkR.h5HńRύEFpnyT먳2ޕ󕹙^:R;cC"4*kc&PfTetۨQ1,%! j@ C&1'.hRNV{{[xm);SF(B^fϛ*wt2TX1ce^J0RSt=^\( AHRVa S띱Vc&ye4zl" Qj4BZ"9P OyN_(Ms.3N>ޚ9+[b*V0_M;R ϖ-CW❇]O1mJҭLrԚ֙ivյI`h{לjrӀ.&{ݼ)Zjo.mԼ׌m+mD{#ޜ-|;7CrǛ_Bo&;ӭEBPpZGePK7R洊 VXA9 EsinfZn} @#tQh\#z,X[q#׵ ǫ>=vPp%TC_%sB\M:G{cMP)+q]ghX`vT?bua{@cТjEN xJqT+VFudXDc`Z Hr$!#3hjV 'U|:t@ })([62޼ B`p~@7)}Rzޚ#/SV>ir`!37n=eN0ˍqfKKv7;=9I*g9V65Xr7<=R;"fMZz|⠊R}i@XԨ2;+ʽ4w:wmt='8k7inTv{fSNO?Eʝm6ہ,>eh,`њ''*+Ei>iy5;XO#4?}Ȕt̎O!|@ou3Os/#w>YD%.[ah|~+\ q1GƎ4<0+m!^٠LeL )&%|qB C|̃=.hmc-$)#Akm,sx%E/JiIIDReMM`/3@"/aN(E+fB~Ze@} <$D`G,E f`s T4qkhʵcb8oΒ<zJH>lc5ǘEpH(h"Xϑ@jq&ɴ8#DNY-OYR Liq8%_ڳH 3;bBIxڅwDݴt)%#-C]C6^I -]vݛ܀|Y)M|TG:S ;Fk':PJ:؃X g)(ieW8=!ЎMQɫ&ڿ(It;1@vŕ Ͼ.jyd${F|3H(G,jQt1LQV $WcJlMYt%=}&I_U̥fԉVȾZt 6K,PumxJeM Иw}Jg+_~#iWXK3|9 #YNfGwvJU]@νP6{/UG>l37KmLNfhfҗeu7k)q Lr„2AH,󣞒.H8ν @G0!o#oA !}WER2exU*| \JɴޑЗHTFBCAYreF^dGb2ϤmEZ1P?Z!I-ABqL8ku9jKŎ,ir.E&.LVhA~$A_ N#ZM:Lh ,qP nZ?We4L04I?p7AQly.2$8~F!8\"ϧnj^.}6!ZuS*{!ϧnAO;\Lr r/ZL8x,K"7 F"X {vC{ofWޯu8mT[ݨf8vQbrIrXջr(+XL%(ھLv,@VK+H3.i#S" )0-4yJKäPHB#qƣ*:aݢ?-Mٳ+l9,t+畔!L^S>&j6\reH9e|iйIxvXFtWh|gCk]#Ȣi }LF'l4W6h!%x1X0rRF{s 
7B|"܈dINC)Sr[GSڠn '2+OF,Zbqǚq&T9Դ'aO-۰?0GJ9< <hJx>; Vv.Q>? W7p0M4m#H^|y|ʈSHt}J(+BƂF"xcFEF$wn0(n%/N¤?L堬׉~;ߕUn^ /.4=>oze?Ymb/NFqєU/4CQC.D 5qg^3iV.bB bp<s]ca}>"X'o%~QySvxSZlNÔzdp`v FY.OƾJ3I0 x$1 De.`U9S: ^ɜQy!XC^b\R-)$^V(cOj¢?1ev?*sD:t]V^N36zWC?~;w:U e;.!}UYS0F!̵Z*jPbH2T&lR*Ae?ue#}JXXT-+2d{}݇ |?"W&f;Sjԥy<[&le:lfنJN- hcC-)jsR GJC~X:-_FEL1Vz9o|+.q\|v0Z{Y9 5FFGⷑȵ.[G?y6 -%}U"myw>WO멎K>\ӊ1<7/ h4U~|WV,.B81<Œi+SzTO,d!C9^ia(I :0DBxR@)ݲ2i;= |2RH#BG!6& "" !CH!r&'SQxX6DH\;$kbG_9ZHΔtʨT 7| SgyB8S ba5\X18ciUrn#%z m"!~B' Q:tN1ґ0A v" z{ ;n3l?ˆ#Ã2,Og=7OPn|4%aɊB^^og:ё~P*TBb /-spt^F̸o-=8%!:߮# 1RFe:%DG#,J\!c^@Dy b;;ᤵkQ-׋n6DP3nƲMǝC۔? G7Ss#GX&UMB:+d0Ӿ'mqzfY}u{em\ ȻP\ojaR:Uh|0G߫򏹶r$ϰ΁90tyT -ZӔ+}Gp'hS2 @V@s{{mK~tQδCQ}rߵ̹>2zTmwk4+ ̜.;AnRn,r A@r(W)nU_ k d; (m&4'rYd̃`sɳ>Hh\?` RLԼhZXz[ՅɦtӢeRͥ߼U?Benn;^(axブ'-?hUrY=h.;R,xM5zI=Rz5!L\xKh5dͣmjØrY_;Zܵg]w6 턷1T61U=D6}wf9n&7t u^D䀾_opܾpUNOx; t -Cdžx 2ZSʗg(I|q XR;IY HVdRayT \`ACΕcֳSvUYke؉H`1O Bg{P|nTNLk2Z A!Dr z0b )ͩu]&ldYSkI0/ej V{ߡP}YTlש(!M*i0hmiE&!sz4#q.ۙI~~2 mَ匡_.E`hc 'ȋC~Gh5OOIӠMrHr\3>=CF,3%\R|gF)8;Ζ_7[5)&%'D&"h$3:E}j&GuR 0DKRQ[''TJ,aN"WME8N= :Rn)Ijq'v,ڄ)pgvP)@GZ$b r!:KűWM:҄#m4a3^&LqILIʤK&qZ$AʅtҠ [P mEDpm}_?]u"&|ۈuKTPB4nC@O ј x"T]!)KѝxX7{#]qCv,s ^Uٰ?eKVRԷ_@cH䴓iB!䉒MDϒMON31O'9D?{WHP2U@cff0Fc恧|dj1},NI& eWeRH&# fD|)by;5z_F?2/)C~+SƩCH /ÛHlZp~_vyrv˜bĈz1LQoD+:ҳ^g}26|g|$Z၂FVϽ %c>}<ɳiӴ{Q<@Qlڽ׷ޯCO2ſM6 ;郚gwñ_+ Q/e8u&jAj?/:/)P )EL$3DeDYU)[@܃.7i?fuя[V .1S5 5z/}8fgi\ی}Iȕ=t0~E1o?m^F 7!9ZůˮWW~\bz幑C 7=do1ج%y5//&{f혻G9MknXP?BCDcsYWEb8L=G.Cq5*u^2CAJax& 4<~_ x}&/`,uxd A|on%<O ʉҎ~`t|xsi93<]7#2'mb53F}́eUJGV:8g\3&2)JRXeJˤCxT(%ՖS߯P)-F%+Bz`({ZS*7J&RY(U& Y-JʹH&pA>@ ؖm4i>55lJQ17,6u,Ga얺pV5dr/ xF R^>) u*O%`0Ҕqw\mh'>l7+SaYzlJRdm>E<c[FsT1%FAu`}uC<8ԲA:YW[H-&84#+w:>< H6P\(\:::RZttXB>Kɫ (j XE1x9z .H CC"}0H$/X! 
!OdgAzǣ L&Qd4X]꾽;8!Ѳ:%HDWP+0^gGBAo;}}45X}k}dbkL,.i-!zxhB?zMmB-`kBX 5bF,ԈZ` \PlpƝ )wiX=5Š/+*VY@^9[*,B fK@֥HNI-F3 { &c F3'j`)D>͆"b*C<ω)jEd*3*;LJ ?FsdqjP#ltAJV GЌ$X1-A)^hT$}6;c$EcB:7=nlH8zl\I*J3ٸZ"9;r$$-QE<&.1HYdrY:++9vҝ,ήIg56ƓxORIj4N0O%e!YCJ*,K>`hՂzP+<[⤙vs΂vR􂖣Ӥ8RJ@Z*ec 癡:Kb8aG-ڝWE0㒮Alz;Ax7M[4f%eP-NA*@:i0΁6)۴ގ4Ht%U["|mKT2J5nҧ8A&bNAdG=hGv>&Ӱ868ZԻe^vfDC&$d#%h挘|Q\I03i'\13Bfff>\>d|H!.@|5h'@&C$^ $gvLrvmg1k$UNֆStɛjn8DBWjłTv-Y.=;1r[єqBe3R@8h=J҅A~[a-Z؂s1!+ttXϐ>Kɫ (j XE1x9z .H CC"}0H$/X! !OdgAzǣ FM#P`u^ e+tJHo:oi卓!̇ ZXC﫳K 9Y7P.^nF C4cpf.lo[|s4|o j𾟦ͤm94 Dߤ%F]VtdNP= /%ҝ{L tv t[0cl2K\X۰|ԋ`t%.)Қ I 1#, o#Ą:3b,KZJ^W^B2YWWnGVާ^I/*EwGW\w 1zI>rmף?~ב ]dww^mO^1Q)kR.19Uf*'gDb+Tƕ>(Yw._9N[un/O]LS[ayY#s [PGU9[I%T}{Set,)? N_ul/3y-!%/8ʦnf6ly<\vő)Łecī:ޫ)tPra_/ ,!+y6d*&=6+wJVni Yv҉iEC9*^HpF{*'c &,6X}uv7\s1x\?OTm9g&f4~ya,] ͔wrŒ3/3fe.~;|^;uҜ.B@' }mU KC@0(j8Oy+mTzy|b8Tv'OB~ {י-sHZ3j]lyq4.Z{ͼy߳'l]紿VEw%k W( Q("1)Gee E:XȖcU]Jᆤp &x>k03ѭRny-LdJ'F%Igj&|4{BeS>i0̥8QƉ QNㅐIĥ-jarl-wa͸ NR-ˆ)+%#rƽ?\U<&}){rQљ ۷%3gvevMueؒ-8]In%]?*,8Kg&7e8pQ:˃јA`3=7M/09 )x %C dZpsr%MH?|}*\B>rb=?ۅwZE2%wq޾=]&jeCahz4]x dB+W0=*BNԞb|<=~9i~>>~~z>=Y S(mߖ`7&an _04v547ZZXch!|6%m`s2~0Rs9BoZ_6g!IN&k4QoilX25j>n8 C<XP֯ևWoc1uoZ(ي;՜ViQB3n`'Ӳo2xAu*2E(eAdrzzwr~Aq^#;SB*nORb4O\x#I5{L6(\!i *} {6|PN eK"@PdŬR"Բ·RHgaSJl;HU!#,6H!J_ _DDiZ '^xJ7%jfE74CѸb?iJAG" TuP2&#~٬?cZFz˞qq3liˀ))b>EG))b@#Lbϲ&bIإg&pވ'29k}PH;g^<ᖹ!'vQ M zWI 0gFpR)^Zn ѥȲRx %zTA=Gkm{ChdGpswS Rg-}>E=y'_o)hHm;J%X!+bTRB*VHm^Bj;RB*WRB*VH X!+bTRB*VH X!+bԠbTRB*VH V[ J]Bjy X!+bTR]M\*M_\k/Σ͗R)+˴m 9x)\9PJN,btQ GDe3hKdFb T[!B'n Kb2# TX#2 Gq㹂w}|st": .Љ<95[͕d}s<5BdǨK_x?ɶxsbQ9LdvE} o';^HӴkzw%,Ve\sy";l)>aʘ$bt:HȦ-V6l)lg}AJXx8+MJ›dk2rKkǝ>>]` ѩi$=Fd:g򉲌q$5J4mWc;N~q |sdӶۭa᫫ץkzwzw"XmCH=\c7AJ%Qw 7jǿzvKJX&5}XX cEb`v^ӅA38=:*sBCe+fM̓.:Sʏ^ =$JA2jMJ90Pse J:sZ8Y8ku.3|pana6.&7.g3ՋVGge# ~by _Xw^/mSg.վTYeUn#K‡3+ݬ0M"H RIBڰ@zhF!S4K}:;G^yyܣw6bЅdЍH]z]wO3H2cN"Ǝ0)DrdhHe :FWL;ɬja-Z^1h(-aDgx{4`22e_3,:no SڕkL aD)഑;wEnfѡb :y$SWh q߷rZÐ(6N[b$`/!Ĝ^{-!vVIȜA*IRI|˙0$(Ąk-Nf:! 
'7GIF&ϬOṈRm>%nxAMYdR rA6٦d>IPv,gڏ ad]=,L& hMJXEDҧV!%'#*Qe:#ө@"TQtȄSְD" g)C&"GGRL"O>PhU\08I7vu/H`c 2 L)!B)+\:- $UщةŻx⅀-w=u4 lN6P\(m^4ٖT )K!qǤJ[wjzѣ]Xdׄo&n:0ΘMO/}PISiqvWN^/{S^)IA d)^DǢf%d"YL)N%8 W[P'\#:ښ*80}T 9n*hHr`]WtF|tUg|$Kb6\WrKȿ2З$!pXC3E2^9yP!)q(Q򬱲<]]U]U]^+w4GS&R!7 /%p`j5^s5-HSR`KrpPl1wG8j験DPa ӎLG eZ3@@FkFMͭH˝X无5fӫx& ։j>xҮhv%A-ܐfJ]}\\0ts o=x2+lu]ƍW\ ZU!AOL6ghcw1x37 Sa2vyu{k] hF^>^Ӻ9]{[\A&߳кMmoy,?5v]jnw]QY1U-٨5l>= f1EYrʠjT`s6fWtd: |yٟ+FMWa<[ߺ6#XUn9cJchH\[7 F ʧ E"[ /vVvoj}n<QVoVN .{4CL\-ڲϺSz?a$~1 HE? zD!18/,\ @ F ,& ҥANECRa~Du S~8!,Ѝ$Sؽ)Hglޕ2_5Iwp٣9Yw h-ROS>RLSNqN~ ,Ls]/ś7yb-IF-p>w/~!7pPzPK*@d J_ %^Fґ0хn%XQy{>П/"88f a r#ȁ1]P-#ROc=  #$$5pF-S/DH[XRA9,a#ֱ.h,SZOi=Ҟ5(:t4}FgRz"O+‘> AD(azrUk&bS D"b>˵a|{ּ\aKU #_oQJ3(|+-u'_U61%#W VW=^ɉNOOfG9UHsARcU/OnʧעT9L~n|S;'3#4V)^g;]}0jm^%C|V:yٌ; 3PLD3]MCڧuedh\(X$WBs.u?66k*`ًYtIs@. =/,YO15->&FTNO5`T5gn0 ?%o/1QWpV`\( v5~wuWS|T-&˯|uh5ɖ-θ)j.9\5;[ts}zv:HOb# /Yֶu-[*K_26/6nectSx\2[qC +A3lnal`b쏑Y2N+gmL"k`-9WZM}昗Q_(LNQ}%Y8IogVz(E'8M)'w>ͿNv J+$T΂#"ᷠE]D)y?dJ`eײP,`r{(L( XD Q03.Q*B\',Yyd`"WDcK$%\2N K9 Tuly8VZ$@ז룯yeQ)d,Ja:_>v\`A|W:^$?HbAyn2DG2(BzDrK=vBhwN)Zr-wpe3ԡz )s|7OHڛ~QlcQ+n(:<R3tny$o k$҂뇩oZ JU 環ǠR) '̧#ͺ:yb⑉bm&J\y0WN:J7u#ky`7ȎN.x\!!/K\܊ twmziZ-eW{#*bVxk,gc{ Dq{mTlf}ikMbwu=TBkh,Pk3zo( 툷1yL]:jDq\OXqoVTZ:]H'Pk9  ,8Y\G#E#I.;.NPQ@8$XzJU@YMO/?ώ{xbޅj7;@h_>.Ӱ?~xNjghݠ$K 袄L Y28 tV7YD!N>:~=A (&кMy,?5NvgGPjnw]QY)Jduˇ2٨5|:**{ d߿*U}^U" 3_N3j<.@MYJ}?zLHUȠ2;D˱ 8gL\QIs ɻ+o=G`Xyu\Bl RLh4Ԑ|SXx^4i7\>. 
diraN3xQIO&OD$C0!B*Caj3jX$"﵌FMFSYLv5nHcXc"0ec F J#azSCx4>ߩu00$5I 9N`.0 t# Q {+%Rc`gl&p4ǼHhPNvs+ww*GkQ%b\!8>jK ^Ȍ ^`A 3)GyO]p[8)REXPl.;W\n6,ꐛpm8(f=(yU420 D%$@ %^Fґ0хT])(,r!c2(h@L> 'H 8kC*эJiLv XOc/ƴ @@ QAK6)@,iP΁8KFbR.hSZOi=Ҟ5(6DTTQ"%ڨU= AU+woQhEZ}"IWAS`J"b2PqRSf6a%$ezaOKKGPeS-6dJDiR?#fGJ h'#4D!HV:d I(*P j4!13ˌ3pdz0AKp "@bN\gfAmeWai-4ͻL6{H86ҡhgPpt(\jG2U ^H@gȘ)3](vB=):gz9_!ipٙyE>Ǝ1cb`D^-$MR,:Zy~fu[-Zl{BH9JCFVuA*9Ij0B=75o|4dlJ!uh4yX)̏g(W ~ ;K"eWȈJ5b"HT HT0F*CrŘC,ZڢD% =EG9Kw&MD}0/]_nhK4mj>uqn[t/v o"OV9| &{wɱ 'bj Fo}b4lήq@4ΐlj'!(f=@g@c)-DQIHsP%K+"#SSD2csS VNseS!heThiCf ,lIggXv(6O|r܃}fQm݇fx15t9;߳S/mR2}N>M9Ls  ja|(BdJGL2@NԩGmF)-"Ijs3H,Cf tN)-U,nW`ǓhVeg}mÜb^d= 1KLZwcTTKS"4D4&CK{x1'|=M$>2LB1.YENYFX :dG^X/jxXc#kǬyf ;]!76ųfU7c@h{3?X*[ݼt[]ݳ,l;keQ݇4|s*w֝-nԹy]t~ws1f ˾{ gV|孻X=ݯ{"eD[sOvcAǚ4c>J8uio S6ڛ@sVuGqhO=?©+&@df,SR)dG)S(EI֊MVgDET'^XC: 8i m@ j,k\7#I#wYl =:z!t$f:RHNiHq4IuUvqȫՂw߾P|'/4e8~[4b?KOZ|]KJ޶E+293lwwYoՂwW 0Ah9L-3kYֻxl||擼'͡GoXxz?iQoW~nV,rWd8; `5sNhPS|7>_XFd/hӓFiCCsKΐh=4DhՈE3}{.瞛gcp@dƬ%ƨCjH%!ii2(0H$/ѨNFBG͔5së p}rt&^={2LMבz}.gh`JFV7ҰV(jkƈIh-{@ͮa^ 88LGy]?Z4Ak Z 6ҙ߃x[_qiQryZ먛d|^~카Y0T)5a|5\*.@+ec)YzH.K]6l2˥ .VP]j~CUͪ R&zdT>. .Y.YXvXZtJlKwɌtӦ> :wU5XU~vNތJG䮪xU$KkR*ݻ7++}4 ViZwWUJzwݕ& l]Uq=wU=|2X] wWo]^S]zM]ƿF?/sZAALw!@haFZD̕v:|{_ 4Iْ Y j6 )ʨY* ş3^ omB>.Oa`#(V7ʤN.$.SYDeUMRTxzmT S,PW0hjgocd)g s)jA/D2*9LS1}@ *[Hb)1|&KVذSB݆^c$JUr497jtAH2جUtv}rݹ۬oLޘCHY!w$ԚG%2q- Z4ɨP@]JTѮ(uF&6 rZ.}#ڐ.kjPAXG FWeaD. +' F3d *Boݵ0^=$ )L9S-B(0T'5 8.:A0^/L8Έ)=8UO Ɋ׵*t 1b6svڐABv֙8; gY/?-j(0B% ]((!?)i+]*)|Eܛ@&pZw/m(jDfPmAz2F-TH <2Ԫ'IA8;yRϓVOۢ#!GAT#(!hs1*E%P^D$qLrXgVϢwo%Ÿm.aZ(/9*>T2R*@ȓ6d .m]=Hos}XZ̝}mfkz\YٽC[wZ__N.qtyZoE< 9"ܢIn ~P_ux6]*J?v 5=]l坏ڹH[x~>鶷 T[-`6_-n;5?=z!(i!# Ϝ_0B=75o|4dlJ탭{HF[ʾueb%L+aYdD%I't0.&T @Ur_ط¾졛;54Ŷ,Ӧ/|vw]tXHsCZXsLX`/? 
?j)R[_#>JeKѸZ}Zht1J^leճgAIP(^0FUСA,@nBr!QιNٲ*@VZ9mΥ"I Z6E"ZEZe򐙲 z0cܙ8rK&báO+3f<9ZcQ2T?4Cw[d~"=7:9YNIgvRխ 2iP* ja|PDBlB ]]R)rN-84%@h3JiIR{% K2k#I2Q(NL zv<fe]2R7pvl7^h#zeNyx9詨qiiMα3Ǟ9@c0N z>%H%|e$bJFQ+`5nk)gk/1"RY,$0(Bl= 1L78M>G`hN4:[jG=<; ,^st/[BcV[^FᝇgkOZ](][oG+_62r_oൽI``B_%(!)zxIix")MX9]SSUUuWU_2 V:Jϐ ߧ;GϟGѫot6f}Jn7đ,&ҺmWޝdl;ύ\-3o"ilt3O|h'/=ThN:tӔoc1O27ScPs[7B g|M6(߸˾.\,9ym9hb-Nppp ;fcKBe ٓN +,~ qr<9\"$UTp;R_ʙ&O\ޥYJ<0TZm}RaVi UyFs(CDF /Q)ɩ_BS((RDÉA헋|^(r*&'1EJnscȭwUeZsD S'zP!|(X[N Tݖ6r[QJ;-؏ڏMM~A_7S (lӻg`jɨ<R1βdNAz/DD02Ƚfc'4v9:4, wZP@K't3< )D%L{++*!~`]H"^GT֡aB~5:thJTp4 Ԡ/KyPHmV$א\x#xeU.ײro+-H/_bOKy$}{L\u!IcNf.}SLZ9a~T=pfݪp WDq&%i EĶS"t5J5QyD1޾y.4JQ#e ie!2aeD)7EΆA39?ݥu,U+taPQ9*.MĔDoOJd+<'uaFKu ggG5\xɃҨ9)XH}[oTFD%z| zW(VKB8^:RփTD#Di.)H`\t$DS%SVcA3g LT(o[7KS-MeGi|:|ˈS$"%I,BƼ-1-6T"=&wɄ0S?VE\ΈP?r7NZbћѱVSD$$欿ʃ2 ̰\WSB %y#xSLM P7x? +rkM嘷v.PfT6&~i3vb-R53xxiX4TN3/,PCJ\+y6enu{D1zCVZZIsd?Nu$O,P||!گz=;&!Ds|vw|Kǿۧ?1~w?S_7(3 ܉ ;poԦq\S3^SO<}1NeKDzgk\)Ȝn~]7|h}6-P͟ v?lvC+Ld];i*n(R狨?ͫCiB4bǼy ̼GZD%:1[SE/,M1WqEq4FeƄYإBH :҇M@?;W Fbp H@_nX+~MI>֗h;<Χ):HCyG9?"qmqS$AXg@x DrW@O<~QZwޞIӔG*jYs¡1OUG`.x:ƿ9czo jİ׉Sf@#44Zh S\+yQ#R0R gD˜qtG̸RN7z\#BF J=>XD` ܙ!N(*7F+My8r=ru]5i;m:,C 2bD'lv ; ԣTpRUqtlS?|G]wu]GoH2zk]P02 : =GޛWy{guKm?E6n ET{/M6.ZTA&yy淸WP4YNŎú5\' @2Q=;һvwZ蓼AQ{VapLLy'w̧Rc1yyo+&.Ǔ [l-;B:Q:ic|8. 
<5lv5Njg*:;?wI!9kS @=B)VU$Xz2j&SYU3fU}J}B*,O'ĚU'v Txsb>HyB O`_2W\N\ej,2 s͕SB8t}Ƣ;LnJ4+-!# dU&W̎Pvv7WJӘh !irSgyA?L 3 -4/4Wx!:Wo_9!s/N) +)$&2ș zfJ(#2(skx9d(r,` zYUzaS׍ecX+njƼٌ?c_m| ?`G?ޙ҂ٙs&ߧݱo{ŬKAY.I(@ Da>Qeri'(ga#Ҙ )X corQR8A9PccB]I!we}LRFGkXJLEwa!5|cUK|IvhU7xyh__ KkBb*A#RxL4IS}Oّlg\}~68ˣ\4!Rj$VOh"H]8\J& 5QG]s5FUqT*-ڇތ.'7P˿RUVi0JP4giΘs pNv,X(\r&+v )9U"9P.6r:Ԛ"uI{ST)ƄܢR%Ol\95A xRE٢k'_Vп94s+|2 y_d#&J=w6r!=@ kXFOTXDeP -0w6c96zNRZq#BRLj t\)H1L%+&tR J:*Fi&ruٻ޶rW~i ؼ o6v @EW[,i%9^x;N1ԛu5m]0~@WIϝk}cɇ=/NkotwAhi&u-:r奷Ά.Ҝn?cV<[vgl8lmװݧҡ6՗,6?m.ȶpд&{k*Y:Jnh;#q7?D?ρU6?߯< V_WkrGhX7HG0RG+ RRm*5TH6D#e18DHG4ǟ?h>CQ 5P{pJ Mg7|qUhݽv[O];Z;|?*^T*eÕ]4(*V)¢!qPD*$cb)U1H*EPHŦ[2nG)'(PY( Bhek-Γl0}}0͋?mP^op'_.!L p4,K4F+A1TBKhF&J֜-okׄJpXRYﴠ h2y2SKQ2!~`]H"TT֖'˖ZxhJT: Ԡ]NyP8H.8!TG @q(x<ెZNk9혜9LQ0M'mQm8̉a- Kv7&5VsOg4d"/͔!ţuB~PMzED Tde\F`^ㅯ; ;gNsqv\c) ͊h!tδ>' !4j区x19,KD9V FdrRGΣqQ;CaLR8Rq#/,(CB3eI~&y![hWSWqiPk"/b%h?8b掲R\tWJM7&ɤFxr I'TH"2Krn1έՌ~|N̋ppgL) up&9H2vHQb%r8P$H5@g a:*\zϒ@t1*-X Yg5_} ج&6BǯK{JKuPH0(f` S-rj EM]5Z[( =߂!! jHD`u*6˩#PGӪ5A-?vRk'-yn (KA-hHQ"4Č/c,x|)j?5E8N^&_d<}ߋ n&;K \c2aNq[#N h)k9iGv7眘l.#sc}x%HO(! M7GϝѐePIv^$ m٣{,Y]S-F!r-[tލbM"sO8.m D$q'3@|SB^qNWoT! 
]I{0D4 wI✻ !csզooV= >; iCk9dBޠH\kJ 1PKG Θ <|]9TȬw \mBء&.U^eq;ߙ` Q#XNg۸ȵ+ͯO}vwx`r>)oXw̦6(W5+iDZrO%/ъ/d{TGnx]RA빲\ ֕T%+@A -qE31Z!X"!ExEU FQP>XRYSlP.e 빱1$SIIψ>1eJ*+D,6NDՊJ .#-UI7Po{ڜ%.<$)| >Z bALI[^ځWB\OR "XDT.j[9 S֢ϗ8Y#EoVbXdJ%71 1>i9 ޠGI R+r'QII;tvCutF@s-U{5v^9N rGԌ mvW.G\ޝηZ;cNKJI&qe{.7 QP‚7bZ ,qU$Nz%H$R ԫA"8N\00ba\:YY"rƅ *&@Iu*uZBa񕶾x5FQcr z߼$#%J>'>q]>kTIޣt[ Z"d*"ShOV$TLAR=G:g~ _rѠBWY ּ5B0U Gj6{aF3܄N 8ȷW[E1`)Q({` DrmmAL&ʌdXG UmsbM O^La OK4s\.v76}+ȷgܤX| $Zi,*H\|5*&hȷ:@3CMP3ę!8sdZJMV-3~!C>ʻy8Kbp[Zb{7gO[^}yq˗×5n9sg2sͲn9.OX-S2S<:ZxB˫^%NV&,Z6w büϘd!m[Be5ڜwj<⤕r$\୤ķC[[ǑEҴp\\pEǯPI3|ĽeF3uRn;\ms7plhg]ظlY *:8\Pr&%ՁJz-+H* !gvAX ނm Q /~uS~l7D!vTYm1kN , pLʵ؝^t 6F .pԎseL6 ДSCB,USAt\_6lJ hD`'y 2gvViֺ3mOE]Mnf_g&y\[{zrK"$w|_,n/BFbq$Gm5#H01Qy&{?ٽNh~9F:ɦQU^:gq#y`(R'] LK1) !tKiigqw7HW?ۏ??|?|L_?~Z珸s)O/Lx36Cs { xɷ-OXG ˮ_踚9oݱS@8U?j/Nd0 NvzFwPF4ߟXoC-B4bÀ@_|~xfj#׫52)ud;p(1m_5)cQG!{ʰ?7:Gv|`0+_)}FrA{' p T F+EI)sy况Fk{G撰Y9Uk:'}㵘>S 3ťJzض2PD)P(ٙQln]i&؛GV< 62!w`~guzW [S+pQ \vWlVQW-m rNa+G.y1jv{7}oݘbxViRM}N{'_*羴^X{73 p_NzMO2sL}=] >[lݙ.l{TNNӺ̎V+6sWVክI)|ꢻ-m>{%mmd+;M]۠3>ruO>XR}[ 6]tTjsvs?~{'G7o6A]|R?7j>Εw^I6Ƹ?W~xj;ܥD-KIl{W7vVw+uրqMVưI|Dc(j ʛrǰLtvOy-|63-93o6+1rxo,gCqĚ-w]h[v}I*]un:[[1.%C6xkb;?{{^㢝bRfd]JzQ즠JΚƝtVΒv(ͥ趚c+! 
حj!ԹZrS+ fjfX|iA5}o܄yj3`䴺~bZk)EL$?;P7P&]Ta".| XjD#mFM#d ]U~b1*BBqx :r.Dw ~U 1Wm{֩ P2|,ROnU H Bާ zM59ՑJj MQ r#钍5nuľ3:9&F7'ߓscOHHI?cAی U)kJKgBrs;GOkEB.聼0Q9 _ F*USFG>rNc)cJNwp A^k!N#8/ p;o4PR6L2X|%~,dp/C^@[k#2e,qIYgEf\ZB4tGV1-|pJiFm*AN\@Rg=akC ,Tx@5w˽AQlG6NS s;0N-]Ye;S5r Talp`_Q&tn6K 4sXm#d2zS҈_t#fP57]Ґ?;1m2ߺȇskA$6`Ѯi\5<cn-(褱Y50LS0aLN QCFu7 d&/ sAGM-VD\ ̑ 0 :;KP/u: ~Ji0-+_CPqj 13o"J%)`:Bb|Pj]')㠄fNc3h5˳Q Hb$ۃB6rZذ8b!z_4yȠΌ>twukk/F/USW%$' cDUJ=5 !`BB%7指ͰuD@.ϛ0W%sJٷUe cJ.]D]eb:\9)>^b=|A|vC.M߃b:EjC$*jPKB*q`(#(hNo(۽X1z, mM!gT]lJ1wlp 䁈:t T(VWC+`Sܬe$+oVQȡ `:#J^l0m,  3-VUBi5ePzPZG(o<*"|\6cQy8+ 衲BX|"6PH!(ƖDMRu`kVY_c5:Μ&рTf53áRk4Go寂vL jor6e -/mal5k9Nm0e_ W/G0/U[1-]/0seE5 QwC<@pff=+[{ pטD'kgUCŬ]ǰjYkIQ3󬑲FhQi7)/2Ftz6\|YiF٤#vCPDPZN!FBos":YY"S )aT@Ot4<ΘEBvZ8Q|v~y?(7,xt1FJ=Ȗ{=Ob|o㧌~@08QwF +^VnE @H@%(D $J QH@%(D $J QH@%(D $J QH@%(D $J QH@%(D $J QH@%(D @.%%K഻~gfk?j%KV((%QH@%(D $J QH@%(D $J QH@%(D $J QH@%(D $J QH@%(D $J QH@%(HؙQ @v;>&@âzJ ~+J QH@%(D $J QH@%(D $J QH@%(D $J QH@%(D $J QH@%(D $J QH@%Uu@ow@\(`>pi%tQ@OP 䵊Q@%(D $J QH@%(D $J QH@%(D $J QH@%(D $J QH@%(D $J QH@%(D $J'hkV:ܺ>7{~>䭦|}{szCokd~_!p kZ0b+S.]z G;WlnPWlm`UrSyu\s̮uJJWO6/c_yeꅽ۟^mpQ{g֤\l^E6؅fŖm._}JxC_2&ڡln `ӏ>K{HpZM?{bf(qN^]N彝/l\y}Sӌ_˛G gm#^f4-"54,=~xTE'óO4@/IcJSGmN?zNtIZ>ĕz"Лdӷ{g8]˽]o{uɛ.|qyK ,(f/^xQ5R,ٻ6dUgwÀuvd8A OE*<,+_̐"%ϦDۓ<3]]ǯ7KTY+'AܮX10nMSVº b|! 
'͜i, :0w+b5 Ҙ03Me$̫0f9*JU0MBڳЄވA`tW*F7,p^W㼺8鷋Qo#.3%m.u<72L] 1hϱI^]FUq/S f\Viev uE?_p~LECșLiiFj3r I-:L 瘺33ۿ?O@ ޜ.{aa4U`<95!mWҁ U`{H_ t( u Ňkh|)J<ŭI3J0ofjWiׯ'7狗J O3U+rU&R 31 OV\ (U75yVẟ;L'Wͅl1Tq1z]^cK.O'UWf0~aDDkO馭 G,2{DJGQL|8zg7YqZ#zmV*{8(dNPc8r1XLs|=-// ۅ^;o=|7x}VO00&ᧃtoߡkko5Ul ̀?)+ 2ULo m?4)iz{Cʦ7D׏ n7Mo\m!QΉhJ(pU1௙^1HhbT-vG#O&Kc̖T9`(qEҤ g[Ff`E؟cJRPXT`#"2Nj`ȌR>߀u;uzG QX0E(ǒx)RMVW@Ԥs;:2ӄe%b*θFygxt3qy#O9yF|pnGjǛӏ6~T cWvD@\h7 zʔcbbǀR@#eFdQ0䐃:" zdwΛWb K2msP쒧v 7*-\ͿaTɊNCe 奿K秕X2tie$*#V9:_F̸>qO&&VQ @3@cLi:oaP#Gn/ <;|uA8imT2M $4cq{ E9Yhl06܎z[m3G&)e{YjZmf7^mvۼ1ek6Qե IMy_XPD|0ar@R"2,r2dW(U?:|`2*\Râ[r <9Vloʍ g՛*tRA#F$Y:dlYƴ  Qz &A:RKs6&T5,tIZ'iǔZŽ Rm ?'T/ VI[tJ/D1^(6HyQ%I؇O G>@mO~ؽy\yg7asyt]VϦpDg j):-,Բ 2(Pz)lp+997W%tz]kSDJBy/mk Y|X:K{EUvT=2E͝+E;`6QEt45tPỢW4EEW@.lNGsRYRGE=z&1ϟ[ 7C'1_"Rb2PqRO/pK.iey^LLT5S-{aeTS4uTǎ 3#%Zδ Eސd$=9I+PR2¤\*P j4!13ˌÈɼF-@csmf%Dhϕ`S6޼( MuoK'ۇt\eݐxf8mx%8V0HH# N( JsFZʜV +(g8z!y #cl I0")A: B,`Q좤"eC4(āCO}BPI(a`Ki## FT8G"W BJ(2fll)g\lFMA25E!*/=A"SE1^kЧ cT*)XcDD |RQ[ÄVs4"W+ e(#Ccmxp:p:')r~'u~\xO{7F3iB`5I#AA8 1!rpr h), d㉽^#هwތ1ӈPK{&*|$cMTZ U`+qpfBwҖ/"MGL`OfP695F<QAxFHmpSau<cp0 4",IƝx9;M8d_A8VgفФKS㮚z99/87_.mMDJ%$tq/'0*{lR|\`{n|>?T*x2LO^( &uD%ԖhQjc*91Zom #~ݣx6sE4jjHZϡSƩW逰d~sVk/ԻT<@x9X@Uv=mhk a*WAy*n8JP$ Job1CΊbiJ ³zsz0DW-vA[19W~t=sZfUƶCZ/$eGZ~*MHg0Hxdo/3@*M=JY2Tey Ja>,_h$%NW)4 *%֌˥5D)VMcB)7j<[֑aYqGQ.k%!RHDcD ].iY?R~Z]s׸L;i{M_ |^tv U|+SAW7 LLsV:cd&`1n6FH(*%7LEJ*:h5R]y~(tgFB0ւi i, d#" s" ? 
#z!Uxd!S>|0z)#"b1h Cʤvg6r`?ƕ/|^keSsZzߍN6[E~=>_QP}dsB3J+#G/l$dӋn`e܍c<-ꪒn`uHź!٪Af_0 Hl%S*fQ3ƒۄ,v&OCu8U;\[kcUgOgJĐBPjGT#g?9Ӽ*:cylu{*'=SJ)+:HE͞ eˢ0)ujCح;T=GY0%/5FB%;[+0l1@@A{aEܫxo]ȎLγz}^cjϯQ{ 솩Vx6%r-iaڨf Z?u\rx+_W54f6zɆIc6RvBE ]fyy-f3r=?o|un׼ o3x'0w=-c^dĆX C_:(U[ϭ͏sΔ{KX}Ä́FO\#JR:c)j|2Z*iTBswIS]E>Kn[H1+0%!Dޛ@j ,GY&+~x?qdr6 9Lf]XOZ`V,e\|g7yu7S.h;tt$t$htt54OƍeHQ9n1H| $  Jz`>Zc'G0 }&;+w"ଏ"W0YkP%:%H/D+v̈́8sY5`g=X̳KIeX!-z|ۓ㕜[J˒kV*zWI.%t )߽K+A'K "k +Mr57i8 ]i06IǺ񅒡%pG$ٖD!IOr#9="3v|a-riܾP-&w07zT!5{Rj_Զ' UAUEw[KȰ(tNiһ(E_XԚ4>29[F1fQ Փ Fa̶I,G)]&%U[3V#gf\R qKu e e\OoJ<ۊ6[&NM<͟ h\?ڿLsmRAzI@Uh0BRK蜃u I34gr`,i`Q&~ҥhDa#frYH9ځ٬p8qKu;[CUeoh I ĂI ӑYEd:WaYIu2}3Q#nF"H>`d9`Ʉ@n YCV$N)ig-a{ϙY%X'JB &XJ$Xj t!1X!GQ=˚b3]')Dhd6uc&Izu [N9;RRtDxjE`⍸-XB("RĘ|'\CʞE',ST}GcQJBNi&$1+$-iN#T#gx#ҡ'ScMvaUN+[BBC6ewf8*--K^>=Q1ˉa6,kΘUhDd&sz5(sH쬒ȜAC'$W;% *Hb<d, ~J'3SvrF(p`Gv,;@B[Ɖ7\RXaeCYL6)'`Be9FΖr՟߁:Q!di+SYELȢ`FX碵O*1JT% tD|^G'u*,NWdD@]p!H&"R ,O>k9IV;?{\cd  YgJ)I2蔕.sɇuu☤T^ċ^do'oc/{Ld1 P. ZlC8rʐ ImҶz'"D8?_JI"&|ۈ`$u`1$MT<h 3 .`N<TRq5yt_!ͤiHY6p閯KWε@-ظ#!2we۪eڢYWk+_PCNE0V+ƛnT* zSWW1(׻۴ɭ}hW=ٳmy0:X ?l \;0=W~%f綡C{cc!#!-PBVTѧ BRb#1 6%LT>*?}¼[ %S%D-r+%}y!:9[VdPMtuCA^o?= }B/+5'Mɤ9tu_=#_P"GB *V; 9%Ix8Ukg+_{T9ibVJi_yG,D-\)+x`\ M8UtD$ ny9ẑ)y1$U.Zg kE+.dG&YϾzר=l}vfv + 0|NmTu3rL. `vet3~zdä1s)f;!lhӮG3<3z:kކsO?~zܛݐ}VjeIА>D6@rFmYe"qoJl$HB7A7P(]6pϠ#I.' ѐK'!6X#IDr$IGRov%OcPĸ faZz۰IA'ϒ<|wFx+:BMOc tPiV]sSoJh 2q!fR4"Xi7atddndv a UJmY]UlO_7WNʉȧ4 RNDy~]\a\H}[`&#ŕ)Ŗ_ǭ|<`C3Yο&r8DyL\''!\$]ɔEf~m/=$qVX.1K42~ir Y%mU8X( |cb=Iݛ[oZE2%? yzt/*ì$I[[.jiZ3sI7d+.Ʉ[(r{k͆cF=ogOnq5O{ws3m+ɾtت; NR=}Uo3 Qy6w= bLah.K>-6zr{ϫũՋl]\uוk5wrcEq(SLH^9/xo8;ǴlbRMFZFEsb g;30i4`O;?dq2""e!#7px(: ^IƲE<)XCOq!,e^V,158',^Y2;NEAfOA2/R/S\R@K`l)C UeSԩNe+iV\=x[Yc$= F' h$M ApZNoB,Q7hߎ pۘǯ'F28JA٣~JY@M U@rxQr%+cJ,ǭbBd1Б;cA"BFTʙ"xy4dBS(S!-CD .5HtܡdL{&! Jy}Bi|ڹxcwz>DX,s*[H4N2ck.9ܓhp?Uig4Z:hjsm)%*e<&f\La͍f%^P ;SJ 'F߁^v!;)jxqli>di)J AY/t„}2 .66v*|yNɘaSm -O? 
Cj/oOp~)$k<;X@/NY^pt>5-M>G\tnQ/(Yh=k&%qYD'3dM 3(Po Ox9 6] oo;M:y-̂cNAmZxPO 8\Mv5;{6I~ލop~8w溳f ۮ3/W} Gn4g굴B9U'cҜ*躹B)\CseR2$֜*.鸹*JusUTЛwh,tOCQg@t.u^2CAЂ0}@<rAY?C!wD?rί.qG\(l6S'ccL:S]~s/&7wύtZhQ5bяH&o|}s 2%4ĊM: ;L"ҽlvw+>Zf%K{iJRM9S^ʲ9̔Hht AX QR<{edry˘<8W8 BȎ Oa|ۃC} IeTiD 4kйAD`MWBQi2qs9(kn5]4 U4׳\kԠMTc(C`#B4QG`%5h-S#-3%G1&UX wN^j[SC>wV`)wG4 $'ȬC!KK>k;IVOZ!jtp΂vR-9RdNJ)1k!%O&g, GѩjcaCE4M[4f%eP-vA*@:iЅ΁6)k[nHIC=U"]SmDLT舶%@]7S@o_ g1 2erYY֫G=17юZ#dW`EлqX-Ʌ؄4Mܕ 7^؎]!9bf|5%eg'*)98KX?8"鄅ize҈>=Y*`Tq>_09Q{H A?[[F ͔!<g.fR]M.NðsJb9a cR(s, 4p-xk& d"zeLĝ7'2Cd2aݚ QX&%觓m1H^ikDV兔K$2V$i Xc Ғd@͸QB`bAxE"bV3 /uNĜ-g!3Ս\R2.tAX"Rzʐ Z2^kȵsUgZhd;ó_UQō׮huuTķ"O|HrLuyN1Z^GVq5&_7r6f.(1^1RVO`)D>͆"b*d!yS*TfUvA\k]~@⼘-oaыͶ Eyw"qwSmG5: GЬ$NI![Jyt4*Ji>{t;'Ed;eLYkRPgOgJcZs.*Z;h85Yx4`կ=nS4,kzfK|K-TY. W eN02.Ĺ(IPQk {{ ScT1ϩ Kt|TrY:++ʼ":%ja6 2hQ#sApVzUr+jl9Nl[oG7A<-e=c<{k-DMK緸 `Sqc.DsKf/=mٚi'7R V ]mܸ63U}1oQ \0zaiM&+PnnjڛO7eu+?6A:B8כL[9[y4K[z ̞|uÅjсhzM*|ht۔v2my8_9}'֡q<7_盛X7;}cӽ޻r*a+V"lgW fYB‹sO|]?l+,h\/'`k>IS<x},T9BT1iINQbR(GWs&JYΤbT[N-_Rd[#%+x`!G\Ha 0=z E5%p Mų.t6NNgp VyjRY/"W9Ifyɭ- ps*t[B;nhvN QFHY0'1ANQ1%j'Fe&f˸VBYBo j ndKY-W0 |' ~o߸6)kQAeha8 RKƨJB; XJzHu #2`7EN\b6'U0656s !ea,Zl~xˮXjcW*[m[EVY*7: DhkbAMPFctvΫ6N>FaE6>vEԽE-b ce Ti@5If-&G 0E2!A Xol;^zM뼦@)*ўV݉E'ɎH&2`uCVij XE1x9z}`~-_W2Kh j!I,HxY$6jn -Z.7Zi탓1̛-c/߳?r~2[vvzg%Ē x!*E C$lm/Iv_ Au D YA#r(ן&r(=~%Ns^Yύ`ΰ5y^l}|Y[·|irO簫lz Cxx?U=shF#ܮ暤ciRs$ZqD(9hA:d0HXG"\JlB Oc* ,A@A kXi x~4~ɗEק5DzFswMBB8i' FR=:#;}B(i-e SA$p>]Hɣ uL#"(\$$1ޓ#BafXTv$(" @ L#NBWճp04(O`MHYtjpp)6㮪E7Jc,.[3—H{z)/j`gMytt7XɩB3휸FZH͝0xMU^DsTÓ,u|,ݏn煳Pf|gvz۹-ճ3>ݏʖ77ncDDLnR? G#~yG]L>L6ٹ-/NY7kZ\Y Is@_C^}_ 9'8돽>K!p:u7?Շ?gWǫWϯ.>v`0Z =V.3|jA" j ov;`"g]zvhQD%-b2 \w΂@\%RL7ao]_^6d~}jPF0.q/pPJ'Ym>uG?>?{^`_e8n1`t< tUePsUc kYi+RH"+7jPNW~_ۀ("g\!r -%& }+JX[Ey1f>2@`Ds>*θFygxt3qZd =!" 
^A/p8%b2,-,R0l\) as&*>״I}'u7,m+bǥw/Yۿ 6=qH&~bbAyn2DG2n!=" 9K-6BTOVSj^ye^/ .h:+V9~ #+5 QaTiGU\qo\V\7Xj,bҹVT*m،kZ]k# ZG:$@fdh՞4#Dz4‚SQ#hGŸo\_ײ'-|g?w`ۃ$ѱ<_x~,je 8|+HԓWT7VsLRZf&h< JQa(,%4?L2%Kۻ/MxڬQDc@J jGU@Qk=ǚNYM4Kҟun:񋒶ָ%-}Th8zDŽ@gš[&0b.eLjd)X=R&R/5eDDL ` k2&"DðUMЧ۱rqm'%NO+ Krkv48{H0swYK7-%-9픀@;5aS:afm"-5U A D4&x._.KfK#GuK5tuz`jⅉbmq&J\^dL5()j$VPbM4003l6q%-E݆Q{-t XӰP( Uc{b0A'}̻@T|Að^;o ;+[3{:&~j64zOC-xoQD+{욇5Mq5agNN@4!׶|B8t=%KoF*Z!I.>.NPQ@8T$XzJU:=]\k 4Cy½a`$cC^^$^?놬oRSk%>cS L0_Jwa&~0uUN1OoMxXUQ.]E2@H9D! r2@'NSwb"I1K$tu^= ̊N  Ux\ȭ?tܤGU¢+7>u&/ei LyT;nW&ztt78( i΀:H>Lr)'_ݓW3QjlšxxŸ僳Mp*lN/v;7㽥zs"@ާQY w 3MCvydB/({݇F9;wѺ~"f]+{:=FХWp,ǡK!p&-3:oW~^]>\\a>x\؁QpHp-jg~ v`izԺaT &UWxe4VtFbqSW"2٧g=S0 *CPRˌ{]+!Ť+6)2Wn7$j jv㗥y8 Q Cl'م?$o!k-:ZP-lI1f %(_.ID}~lyïc?Ff y"B(S&?뇦I^}+uzG 0xD(ǒh)R)e դU^:2 >F{ưG cK=NEuPU*Yǘ3qpJG"kP8*1*}1`prڐ0| 6lBllNF`sӆ"^MSv6OP*ac{:ԕ~ɹ`1X>QAt #Qs P 5\fx۞@qtE͈qQQ*M QDc@Jpd=ե0u-RD袴ڮJm'96j{M۱c$nv}5t#Y/'Ydw<z}Z}K$MRS=CJԅMqƚMN몚ks^{%Zm4;&K f/;u/kȂ 3 /X&w[?E}Ӷe^ ?^f;׼G6xtÓ"qH]",~ŘWtv`oXrفIX7P8NV[7of[=pOdty!b&ɏ,bv2 b2k)xn6hy/nXOh)*֙I}=UZjbO6<7_Ѣf o!I vF铕\(XD|Nު :p湑KBz5OxaU5Ɠj$ߎpܿFuܓ MǪ#l^٤51w,??R4$H#ߍG.wẸ\9ھ0}E朏6|V2մFty|ݯ3ɛA#"7BUxny(m*J;PS"f|W4W;6b,ꓩjLNnͿ׺o~lFL&=mĚ{tog`wfqxm7Nvۍy Y3USF|TR2"k*TR< P* ZsDۭ*ڗv%ˆ7TcѲ>+U5I`O2}qZLn !$ 4qF׉*#$tbBn h->(b$L<!HƢJs*mJڦVHo<_iw7h עs[_CqɉV=@RIGRv\x0 )4ed:kWb!}h̋Hp/g'Lk "D1Z+' ):mrsQq1YSkST,IkYHЎp̀5g)w7ĨR.rbb(g8Qs\[K:JK! 
Qڛ(a`m@<5 d+Z{()_8}dBy@*Y\m4(,raI@!%@hZFR9U{;_mk`<#Q1J 1ULXö״iM`VćfխD?s&v65_xU&7_e¥W^Nc d/~2x姷Q+a]QnD-y*{#ǔ-ܲ-rWo "xt8gᵇ$t1N~lskzY~Xxms=)F&O"_ӫMJ4%h'+˵R_/xGQZmn=r܌'4qcf A"V MAqR`yH [6rKIz6t:i$J")!gSGiO9?L+Mx*_<[ݢ$爒 ջfU>r;\Bw w5į5Z%a*gx,lUMxzr{<1.ƣiR@TBbT:74NB*O5D!BO֫W?=8ˣ"$H2)JA `$ѱ"5SSjTʤdZzOGwFܖjc֦-5߅-RrvSf;dQ4Y@'i E%Q<Z`J*3΂@)(Brd|pTkƤWܢP%OJ#g?g}r<i]0˶q[cS,dYR!Mx^5Mܤ*q *h[RIMF :FLLE`:s)VKmJ&fqB ]9BDTIMr ҩ _X,FΎc߂8O&:]pKNR6giϖ;LSG}n53+ȋcvSxǓy=L ;"7tUE?k$|:f3&ytՌ.f*kW^k:7]3[?7 ߁-m&\-/loсXr)[§ɤݟvoС6W,6?mΥ h D.6c`h%W/"҄dM)`Ñ_Jd}9x8Rm aG&=#ѩr+Y?l384:&K!&rYzَn< +e[Pw<UaV=j&Ym3zʥs&9* o4GmLJ֖t4$u(&H*ЈD͐ =&Dx#b~a]H2FuԶty1ramԏ?tpmAbTDԅQ#b C QZҔfpy#;jh2Tx(_Ӯ^zIkuZ h;cc@#Vn/*i<e(DS Ad墱SB #e/_q?)2)7+?j'+DF]$T)bѡYPR$%˖<D{ZqG&g*uZDD..AlZS~g' ճirQ R㷒߷vt,)7?y|?ĨA:}fō?j@ 59G#uh{DP:Cm[!(+H a%>Z`.ɍ˛ ۅ\WrT[0a5L~ѝvG A4Co'7n<|3_w]~^i}9椶3; nZU/G5ZM 9h|GPsvY1`r;n ټE/ј7dP3D Hc`k ~\n쇼~aucp\1TP&"\S+uNы쒶1Qj}J )ͅeFI1W~h%z|A+ GCFo:4D,,\\"g^s0?+7motJS ft)e+NL mwe*5r/)t`*Hg*+YW*S[WJzzpep+$XH*vB*%鵫W UN:]ȼFod/{,:kԛ=yli r~hm= }V?d zϣ>1'ի8r/YL*AE=IT ) L i%$UPyd"sdf!p Jnl{K;.!&Ovqc#WY|v[O0Tt2Gh|ǎh|2 zzpňc:Wh"Xg*+;WkWW/8n<\e;Z WhWZWJ=\Db2YiJlիAuJ ,_5_zĄB88Pw+2Qv{)r C5.SPTʛ}2fK_alŦVtO F 1eXxGy)+7~Z|͐7D:B!8bB`D$4}n޾[T\p{(L(j͸s>*θFygxt̃V dh`d =))RR%(T"Ig-s:h6.wlN@ :@h,?lN 3P{V/mB9hs_?unڒMNΣQ*L,`[,(C4z<8NG! 
[binary data: gzip-compressed log archive (kubelet.log.gz from zuul-output); contents are not recoverable as text]
!=)`-fiҢkT{"1Bީ쌅G޴TvM.n͖喔JDf4W5bn6]|Bp_o}7a zЖM c9ZԾ05fxG7nQ͓(eY$TDٶ!/JZ=Bo>yD}$g"H ""H ""H ""H ""H ""H ""H ""H ""H ""H ""H ""%:H L_ jH ɓ'[y I { ̼ޏ D$@D D$@D D$@D D$@D D$@D D$@D D$@D D$@D D$@D D$@D D$@D D$ @$="IuX&ݗ&['^z\/<->DMA{pI$@_GR "N WG=í!H ""H ""H ""H ""H ""H ""H ""H ""H ""H "I 8ɿ&HB҉Ce'KaI DC D$@D D$@D D$@D D$@D D$@D D$@D D$@D D$@D D$@D D$@D D$@D D$зC}Z/ڛZoog:mr;~P\w*%/Vygiv5](ҽ7ݽZN.Z$E8s]6fg)W?ꙍyglZ>A~Nw7f{=$bLg鼤4,?3ls1?o. O?%o}ϯHףjB$O|VpZl01fa}0^\me \/ݵlcU}Fv6bqryXUepn$"0jI5bP3%DE`Ě>i>e=xdҷ^lKoe~-E}a6\~ŽG)Z=䃿|_n3J#b~籡;9cQOY.Z,ej8s[!WWdX1NmFY*[ncM/3E{.km彚Xt۔j}0XXyLv=j,t/_N17|ҿ-zzavqG}ৰ<ߴs*2bsiw.ut)-k˷[:-?r})?єۡM8NonM~[7r[a0+")kFw3hVfc1u63VDTjzd‚R.0dm 6 0J.bDI̋V\,5$Yiz$un9F+0q8%l頩B6\,&Wp‡?us@:\2&kkG}~I-]YĽy-ܥCkJu٣ȃrۦ!.CRh;tr^neME}|,gceezym2!Q%9;m 'm~l~>}n]z@v ޏa~3Z_M.J▎k#!Rouɂg鏋w忦o^W nyH%Vmz`uɖ )-㍷ݕV.F` w.ZQ<^ ,#zvhC]kPEzfŒJa>=K}9N̆YQ+5{˽RM9\gҋp`CT:."vEDxᎇ>⛑zԪ[{Eek<44jg;9_XoU+ BIktw>sL1Êz{sgl:AE=s7*?ϓ K*ǪXu jsoW% C\ e+ը4ZU~AQ~v~9*y [W">1sO>;e;>t>!\-W/n?Q{pQY6)E@ RK&v9ˆ1VtCŌ]IFYrj=hݺ߂Y_geݖ9jێqKL1F; ,_(le߮T18\bhVa!!e"G)r<1j˼tK`.I$ŪkûV^M2YaFCaef?DZ֝#Ǖorr6٦t2l&{lzmgpwতp8M.ly=iz>rlsk2^,ɝ][e̪.CBno Ii=vN{ݜy7q3h=]oN-:sxh^ԹC<"8ixܬ==]]Lk*6*5eY~BW 6J.io`GC-."=C(Qz=)enjٶi]f95 c<3q+bT\+&w5v4n9yxs2t  @ȞNiٰ߲ qĽ3itiYZt0mTۿH rXx Sq>mWdll͹Ȝ\ vu6WߒoJvBVl֩BuggUn#X_;BEY}᝷9˫H\ۊ؎9_EyS.:֫;=vMTvq6!M v׈b3}H\] />Agr[dY-ê ):H(CX\X2 #|>1WYBP]l蝯ݭ;>|Ksc=-yD;#7LQ6^EydR@޵MbĘ'~djK~~uL~4]IpBُjg?Xgn;i/uUȢy.+NIK[19P/թ$ٗob/uW6HHǡjI\ |,2U uë!:L"U_#e0}we&&'C)\qJFjRbr[w1[k[w΁ozJx;VμqqNt0ecv#0 ɅrxY nB#t̨] DGx-(Œ2x5[2cW˘9K83 ݺX4Md,d,!U)j'!%֌K)-$yaK^x19j{ 7Zg'6hN; XUJa7atf_H1dY2TmnJR'E=@H$_SY6@NnWwܽZZ S,B}AR-k< 믃 P1zŅ!""<)p?j)zܓ?˘EX[5$/jyr,9 OI !. 
A%oT`q!UMbQyDODKI"%bHWZ^g RRWW:} >fuBj-\gL}[B$d^ @ܡ.uTÓ I>N ahtp|&jOw]RDAO+$T΂#"ᷠ߉R࠻ R,gr{(L( XD Q03.Q*B\o1(`E  (C4z$NS!=" 9vBճ)Z7ąi n˵}A]t.ow7 i7Eh6u J+n(:<3tnyʊh#VN1lƭuǝN>ؑMLryå 34#Dz4‚SQrgG~r{3i$V%ѱ>_x-ze+;QD=L٠sڍSQa?|tm";U*Lpd=<;nCAt˃5H 傽j GoƎFG{G˅p'r4'((z( iЍF18v.m\*GgW,Mco$yޫ¦dx{W#HZX3gA`l p E4uc=Z#@;py9a 1 c@T+"PqN8O׍'?^bF1; kŔxK)᭰N"(8H1y FwToa-"XEхˀPԵf׳ _Ȏ.x".E>N'?8 h$貆e,P `0#LW D8:ʝl|ʀӢW+V_S+*N?u\f-@Yw(݈Y̻PG%jn2>:;}dOh]O) XތG>D3O?7|mWE|S8`c5޽{tG Wg_+O'fx2z_ִ{p F{mg^Kq ׼sp2>[M*V=;NN_>y0c\Ofcim:1,{7|ۦp2'Ll>/?=i?v/=2 Ϻh`s4Mz7yâ ykFpS`!DZw6y]MeF=Lskxl0ۻ[(!>>jz]bR#xZ YHX \Gn|if+z82/$,YrzYqV]Xpr7pr^>qAZ)mHs,6@^}-2^ŅV/x9 ĿŸE:L+J,؜Oh?[Wm)0MM܂=;$Z8K\l60>bna)x߲@u{Π=Sj\]$oNW .;zL .l+;Ѻ*4{s[-r='6Xȁ2*Iϵ[2Al@[泤~<.:p>0@Nvuc:SDŽEel("HA"QAB}nxd:!kUݯkl 7 sR'2::|k|=ʅ[ϫY}IQJ1IX;VyvCzhuR*NEs<)&L{"Cɼ&|dfd{ʳK{X}G{V4\ZLciRbfbb YlRgt$XhHXGObIX8FҎYSYJy>%nKȳbhጛn^>+k ssFIpWi[ZSxŢRiC˺nmYv;CgU,`\I ӎ#>gGwK@ |0ÃqX:L"TxpFǤzw2%eeTAY+A 8 (O \5 K%BUmeIlcb bTTÉgdu7:P~ {G"V ˤN|Zt[GU8Іz'`uKQ.:08Q,!)υgv-!c\AXdMHlc#L}T(/ q)8Q#X4W4}m!ؙg񟙥!}Dς8^W&"lZ$WIV:=f('MGH-y&g s[gUd^?K3Cp^8*:"{lm $jůIP)f#VVHp$"ZGbNچ!p:<B< V0b:EY٘dxZ=j3ɶQ[duL40e0rW̒%xlP1!4BxEp]Y^g7/=;|Yx7G'Go?SRVԍAx}#0Zw54 MЄv9z=ƊlX+ٯӯ 7v+oҥi/ ?m>'6IWwѶu-˹qq݉/Yb=A#gp~t320SlדF]k(IzG  [֤\zr,9Q/XIRPMvzm撗6ض G cK=NEuPUJȘ3fpJG"trUFZFw7w4Zx۝-wagk-;KN;U;םqq~[ejK>"i/`ű%y6X#Ιg$¦AJx†J/Pݥ(6%ʷai-X/E%tBT&*L*Y}\̑`9#d{u- ۡ|}r/V:Iht>^Z[8#M!Q&Xs;eJ"8X<u;jK-7t7oA1Yt.uWuEU1ftWdcSx;:kgT^jJĩE8joɠ?)~F3d }2l:R}y¹`̱|25ܼ"y@Gi8*aXa@pՖ`,pvs^UJQi*A2TS-Y) LGŸ ;|"Ni嘛E .-ͲeρNagXKS_k!^݋ AZ>$Fe ;PnJ[B5Oӛepr# $gKr1ݹ\yD."{{<'xS̔e}͆xx f][o[9+> Y[觞bgi0XR؎ě^ߢlɲ#ɲL۲spС)}d]w]٬%M{|~n펲ҵkU;ܼA])vOToit;UDoz߾{#c5S(aIAeWs?2-rq_cra'fU/_|SG8]6-)on,u}\ 7~pѕ|>i3};EfOf0B;8p6rMW +WڦKm*RVEy忈jŵ?u!FݼU6}_^Vm,OQ$&(Gk4RTQ_#3aD,0 B >b W5pk [OwL&QO0;\`1l `1l `1l `1`  |izqׇI>ns9u[n3?cݩ\ T49<|9m+iWXjtu{r,{mPؒuC:+J Et{[NDũPdkpSHkCJѯJ8=YL]4/eG(<ˤ_$Ϭ_N?=<]}U~bI.]nʃy˝RE H%wnk׊vo#z!6hiSAVmAZ2ߓpr?vTl';D~0E_v9dFH4}z`X=Z&\%{xlQijEWZ'%bTC}ZJŌjmU]5c3Krw^a}m8!nY=:<|L v&:Ӊ;-Q_zQ| CO1s}>Nݿ-{D*P %4WjSn.*j?(~=̔"r.jT 
JДblHd8Ae9db?\^@meP> A|ڳvVЭ>2/ӣ6[]G_>A-x&Q JQِDURmٱ0Ʉ\bW ΤmM!ErLZ;#":Ր!T{FQU݆sYo@/Φu]iųW_?6{q 1 bd/,n:!t`1P€`Cc82Ǒ9qOc^dKq%*d+$)U[XU"UW.9+wzlB K2S1%ưdU pBd \gm8[2+X۬jC:>ܝ =^qdz|8OzEMMihh{[w}v>)vJm=ȍCyVq8 m{aO 㭽 gZW7w/y:EL8mYrõnw~EK-7C#?|sȼ0`_=_sӢ' y s/Ds_16/G:̨s qd鉹y,W164$tc/{odu" 88hyЮEl(#eA);s`slJTYԢr1bB>Zn7r݅3Rޔ}~#x|z]{M:v~pǻ'0aݻ_5rs7z?#(gji [HG eQ7`05ԁ3:6[tQ'۳=ߎcYiTw5;9:鐼#$@[ҔHéݢtJ(rMD 1}=O>1Z𓋭jM.0"SHzؐ>YLt K{vNbCpaMv9TOkAz^ӖT]"cbCP{mQ`ruGmS\tաL½@S}B)TCkt-΋8[_T5;5Kvc =x/K !#V( D8DPETH\cZcᑂU"ȬiޓB3T 50\bTuΖz6ӫ?ހ 7oTyVUMUb(@a B= 6bbR3AKe*NGNd]&ъTXTf"꤬9ESaBҜP$Zyz2'-w2F2>*d$=+(v#2qP<cAeFpLx|_ N׆OmQ~u=`^r"p*j LvFMJĎFmw""crC+h, 71JDdߔo0XX/b[J+兺Yp>lte*>WGzT~c{MBvY;xe6O;Hado=Qץk=Qf@Fgr6`UK_TV`C2է˃G--S! (;n5XSn=bkN3|ѣQfJ> W0urI\"3 $lN-`+cR>!mOVs?k(2=ІDS0smdg\cn"G=/~Rn~<Ƚ㈎^5d'2/|1S{oURL0*OB=D!6Hd5-VP bMbj$cSB9R*F[F@Kūm YO~زw>A׾&"[߀&2G.h"Q ՛}^Xg|Qx2~69ӏ%y/F:\`󐒉CͺA6 ;'ʺN{^(-]4ڣBiYmNb8 j2)UZJ(Nt!@p2zw?6nFCKCҬ`LY&Wu Zf4[RVEYP\aؕ]ާ"i;f~:#rx5`i.*xgsa,#>l `ÄW_z{AP_WA`B[U20FJ,bsPjO[7ssyjAv}gi-xzttWLLJp~q޹hM{Er^v=6 xc'4 A41 P[&Pnz{eYZWy0omX畗^Ie98x&DžBJh\Ծrd8$YCdc`5]Ic(=x%yi (?vMv0/Da`&kT}~݃Ni yP*"T )JfpL m|O4r>X~`ii2yJbՖl2`\AoRFz#P3eӒLhK^6$^B|m8[5ݧw#kJB.@0q8L-Rk>hNłvB5SA9SiW)UfcTv sTJsr.uE+@vl%лol 42mKK4ۀ-M*~U,Vy)) )ai1g!JaT91 aR;ѽ~z5l^7τsu>[Z}=ft/"J- sct!=¤BzȁJeü0o޾Uo }M"[0~K(v`mo(R;o#FNj6EQp.W2_S<`s-:O1Ҁ7>+ۺ^_XO_b{ӥl`k.PHfdh՞<#`HbaJ:S꒕S9:C6)ߤ )oCb8BC٧P.m`O0lkBW릾?{jz+fGGZ'-}s7^ 忭אKIf;ϖ-\*[^5lnzf~Mp:s9ً:LV(αCRf(6b;@.Mg(=1C ;q*>-KP CR(7]{`rn|ؔ)-SfcclZn3\/ \[{,XnvhD[eD` :̜==3i#E JjID4N$BZˇ7 jGIblX&ɤLp 9TTT#Y%MF>0?d&xZ"[NϜdr|A^?+xC4$χi#Ͼ, Ȭ +_h?NՋ wfۗKW/F']3hN(mopRTg/g/?UӘm|<.~e.ڃwoyNZO6t۝ktdiң/WWۘ*Y-".LS8>!T~笒;?5 6Ez f5Ru }װ0GAQLQnA.`v.aa2Xx4%ϊu 3g_iůWC6';@" ie^y-~#743ڕ(4/0,$"i${1^بmpr&wqrH@Nؔj]?t}sMBL>yב `X.:nR4Lћkbі:]O `0lXm6]c4\ԚJm_P0BEePƼUiD*1*UE7^;QDDv.s[g~79-r='6Xlȁ'Iϵ[2A7ZRߞ?PMb5#YjT;{/Gel("HA"QAB[ z]+0 ^r띀94>D"/\tTaAqX0BS 5T/#g*`M#DFP96/ q)(@j|G?hk'd};?=4ZKD '3;iw|ib |&dJ0 lb$X.R qǓd{o KW\aVG.~F`kX +PQ4 u&a0u-3\FcVz7   /XK_XMnFiZ;+BoX|g&O"\fG FBG9MWpIQ1ZnRծ L +a 
+rRYY}uw>]0R4ܵ$IxAy u5 M4u^_ƊYP,n oO{E{TȜc{vN'͞ ح[hUU ꜈5vv]"Dњ0*1hFzSJs=IZMb1yeMSG(ǒ)pJ2jImp  ҁQx$8ST0Y %\PI8clt!\sr4mMZ[Mηw亳V5;#SβU9hĭ`\0s,XK93|AAfxHVuEz)jcHLG)b$i Z{1[,ljTۍ-U=ZF~nJH0aJ4Aɸ)OGg#y6V&P0Yu[qo#5q7ka| T6`(^F,ke~F~\Et is y[(|TTO yzG0cfxx2JF8o^eIbEXj%#?neº1a%\|I3:93ZJ&zG@e:leZaMlfA=-C _Abp?2Whx-74Nw;v|=1iAsth,Kg񾝕PuS ʠǿxp 5.=8L@OffO^Cl*i?0@XA(C쓒/UѫG'OGCzцx0r0Ѿ{aFo}Bߎ{w_^H;P-k\G<3.~J'/#t*[j| ?^m>+:a~t7݉%o(R W*jAB.Rwy̒Q 64 (݈R3kaӈPJwNJ١]Ոſ[RL޳ĸP#bv.è0#¥(v3,lrDԛIW%;7Gf7Gn8첏JZ[8#ZI{!IkCQv;ZG@².db>H׸sb*tvs1}PJ>u#(Yr:hNP?SIyikhpy X I`I X -tJHe<[7~8L#͡n7WR}]s3ev2֏n?d`PɚŞQ:sQ3JWx@nAzر#hoj`1VYE 62( F刊`) NPL25spml8a?b =Z-FAh4,BkK):]1VhvDSb1Nx)ASKM96c9hD*I5^x-AQAIǠZqQ' Ӛ*LG eZ3 #5C&(Fs+%fY9;jlxxY]~wjI\nO[&6MgmK*ܒM[73jRt޹`\Ye]9[3^!r#iEԖ|`ڞBZ@u1a yfS,]c߮kmbZud?{ƍ w26U͞JUN6SN*ִǕ~Ë(Yh (y*Y`0ZP[*Bwy4vjw:/lբ{.5[2Ds˦f\m2m !l9:6dݳmqn2q?AYk.BLF.B:nC$G(ʡԩѠ.̨s C ɾA7Vfݹ]QhK1-uCH*LNzˆ>}I7E/vu5ff hwpЬ,*8k%lVƺzQb.ֻXRD<8,:RVZi!vXZ#c܎J3,l&biz{+9ʌ7|O=}? ƛޠx:{֦ԠbʃWyoKFYg1sY5IhhjETRR*l'[Щm'|(JLEd[툝NOp<n&mv@ndHl 1K0)S6$lz޹$G5$#a1ŜHfRc V2qZSĄײYwnި_`x(L>ED݀"nܴydA :y)Vyd<AX##*Z&L:'@4Ed4PL”knْVAtRgrriͺs;"^&ԉqq\:fR'.la&.+ QA(0RFIIŇۈ A0}a3[) {Vnۜ~Tk>"*VZ6T!V7'T/]] G`IfsZ_q۳/}Q ~2u/G'铳( פ)(fJChYAɔƅ슴@% hLF7;ddc"X@!OYRNRdt:,enڬ;;8:1 M9V"B֎ݢviMWkNnJ8}Gl v#5 /oQ*hW^Pl{G>2`t tp&,- ZlH91laf,$qH M4xг9֠5yPPEEDnzeȬ']WTm%_kl6&y ˍi)1U Yi<.4ļn@Й3+=0K1NM: .kt/}t'w;Y]1dqEY'ĠM C!@ J^h8g[": PNgWݔ%l[Oy_?IfDVt͓@}Xt|=(2Og/oc7;:jN.%%k{f=[2澴vz"_yUMKb1wN eZW |n |j/"mYQ7o~k@["`Q՝Lj 1[Jq1kP- s0t@g|ҹzΘYita/]wbL.Ym7\ d*"-Nmxw]՝#M9$1 0\e!Js.1=528 ℱ45гzyn"Z)6dLdKdr9B0q. "1y O1]bF>O^녖M0)PYL6C6p 2OAizZl]G Pm$Rڲ;NwN6K-@*5x FYgʁ ? `4$\Ji  00O d)ވϥ},FPR?2JR+Gl9n;FW3W]~osY4ThuTwd <dž`G 4z9W?m//ϏdzeB}QOk% 7_@wdzWY>7_~X[ƬWAa~ųͿZ]}y4W˳_>*_Woل? 
^N^9>ՇteW~S~1ljmR͡z۳oKt>9G6{_=.s>c6eٸJDfoǷ#w1M_]zЇЇ_v߭n) % BXL+./[dy99{c_.J15Q z>Ƴzv3agpc39=?[;B'ovA0+_ }_FS>`yO$FTntz{~E{Oqi]>#ٿBx#uwiXx9@wC OkIʶzfxIGXlN{1?S¨%1dIMA>sYrl 59oM,/8wňsw#=ɜ~>(F* -|p%3e|Pa/\ l7H jٰ{>n `_Lf,@ZGֵGTK>e iimĮ%SC qD^*C;jsyD!چ..=!]AfnXy sm~9q"릥~L*]X׏ T HSʪkyFā㺚 JrO bP_B^#qR!,J=[)kS$'æ6֤r65)XCWHFq_ƛ#.ig~Je7$W?tqtY {VT{;}J`8&F5l+4DގzN _r^D}_'tw'$V@xsIGCPRv`A)Ʉ *c6ʻyǂR Jm)2'x] OںdHd'Z{+"@UA IUTK1L/cO<.]nܫvuXW֢^y^fl}RYVq7*a(5aE!ET$IIu]He.jD}Msκ$kyM&%c Zs"w9JaQpGQ[w)HIVxMVܴ{uZxiv7\D{BZh[Mq!ZҠcbJ"A U> 2 zW+b|bqwCl/.Qޠ%e.- y4{.)h.ъin)@qyi؉})=V=f<󊐱vv!%JV0 gEGAs7y4~HQA3gxT&*T{{&߷dZK6)HA ;6 ->’(UASp*9 dDD{Bj)49\D#%B$[T/k 82TxL ͡8+OC @сBػ8;'<.eT`SE$?юy}ۮ璐 Ỗq!2hN%v2%' o?ytI~-GJޠH'Gw/W[-7\ YEG<9S:}S;O O_$??]\NzK9R˥NApʭȲ[荛W(+.Pٍpmջ֞уjxv?K7?'o/6E|i?i4<l7$j_eWv:Q4dL\i< l"[$ǓG V6LƣAs/}͛cdM6ڸW,{5[|LJCT"N6W~8@t?~{o^?߾Fc k- ˡ_ɸ ~r~Ϋ6qۇ2y%:f'qWpHn՚8Di(E4Ə VP:sn}n.f[";?hh 'n')wRPrgjXE}\,JИ4*JHSkm-I_d.EI=t#KLH,8:\ .8Gc3zmuZyө@pk ѻowGks&;:AZ&=|S[kM%b1+m*V޹+8Pd-*W̊h8:&ް>-l{` uGiDl7z§Qznѭ{EiR}#2-JZ G@ԣ~ v꾽g/]'_cu%կ&xK-oj٣$x98pLvwpDWn_u澀.,OzH(ntErPMx?aYգֺU. 7dڒ6Z}T/J08o].A-ROUwWbWq_TWU !؛U rkv%gK6KY,sWy]ħzNBXi2gH-.@HaU8JeLT0N.^"riSu>΢"d luaЈ/ٚ :tz8);٬Z9k*h 6$݋gCUj]YhX϶͐wV|5'߭)=9 ĥȓ/u m9/pχ _~vξQ k?!Sz(E[#{TN(6aO:,LTd+ٗSٹ>֭r*+T'QNjLDLKM˽WJZ*8o9V9Vs}6֓:DvØbNpǭT@9xvm[˹d#u&h /T* Ξ8")2tl!P$GRAAEm2U7:!r:$NS YD6y =F˷]`xeZ3>Dƕ֚P ?r<1S!FEbRR+*Y,(@u1R͕cH(a:*eHYt1*-@鬵Hgˮ2: jnۑ|DGIr 5DT`!*: 6T Cu/Kj 0keIMXfoKO¬@֊D$[m:"xA8/"!EPIv^$ ɣ=X)cL;Cma}L):AYz;:CrȪ#(R*9ku %@EA k#ʤ\!9r&T9W}V~ NH_':!Ѯ/r#4fYِMakZHӒH޺ƨY (؛jA빰Tօ%yR=DGjGCq}k!Y:B:B/̌J檲+"5*jr?oHR2 K ` !hne=IB81Sx ψ">10%vek˹ۉJ'IԦ(5S H%1Ҡfd*e" hA)$Ts]6AP82#,Z u`"=6gڵ~̜t6|HOܧc.7[:-x>jqQ٨5ۏ׫(\\W&FCd8ڱ`I(\r&;% *{CErY#Tc-cjk%sĐJ1&DcU]LmVP .|:^G8]Tsw.,G5vނ|7%Fqq䍑!O;u{>8,`ŠkFף Cw'7a/Pw1_vt_vCioXnmls&:`܆dR@TVH`:AuaЂs С7ٟ5DwLv0ٵ!Mms*"OEs|>p4"Dƙ{mewȒQ9;S&%Gi$E QԨ)6DZBsRqQ\".*FNa) X.h[y#wH$9ŵѾeBmm9T8eěZg7N]/ܫj%f}. 
s}T2g܅#Ea%Di%%r p]޵6c"i7@y950`Y$C yH[hYHrz9,IVɲLٲmJ$SbY^vd++%=)hMB3ѪkI:gXm8#c=R cPUBX6g~[qG гY\z4 &F#MYD6 7$ iUVZԒ:稄6#RJ`oVRݔ 4 E( AHQ`S%ph2v̺,%p:ǔwǟVk8#vqM<Ԯ;Em]u=ݷK9#LḰ%!'i1eΥbYJYic) 9e25X" 5G( Y$:__pV_-x.Xm2"{Dܫhv,灁9+K(h{Bw1DUDt3 nH6\))80D ƍL6FҤ (vg#bPe*Rslvɑ(ye\t=.7{pV  dIahHYl]A1)plrKaw?w?'o5rs{?#pꭺc+'CׅJ!9"dž1 x Z@Ed_KN>J8#򻨥nsxIrb}9E.X"fHuJL(IC2X6H$3t4ݱ66w\ ˙V,h:YVR\PHNnX*}i@ZU@TGW: 3뼣nfγT"F:8ՆTiWݍc/,$,>fRcPVz|$HHW$aIOR.3TV^Hw9ͧw7*.Z9!=Qb&GB|Dܙ'gTޝQRywT)YbEG𭗊qWJ5'-GBac^˻|1 ܖ)x ]69y "-+0Df,vC`[9%?2Zj~}bBkϳο~yv,ͧ]+b1ӯFik57 CmU4vʫY RWSbDJ FXthY_EDuOjޟgVVYuuOfovr8|nQlWj^z@79/[s܊?ZΣ/ޢoRX ů$hMN>1.N"ft@*Qcp[w#gN3ܧjpth!4Y$рdMVx^L`{)M4;s^g0 (|?JWFfWŽ>}3EݎI||J+7Lmd`m6ɮ_'eܕC}s=Of*'7-o5CbYX;F7ޣTDtjcQ>:PJLz"avc $/ o4go)T-J}9[fЖdFT"MH d#"I0 I/1R(J 42nWJEXەp7&NKT{j(d<ڬw.2:nI=8o|S/m䜉/%\LPMdFf26"}*»do$"m4k:<ӃKfo MgWoT)v2$b6fU>ZUm ߔk.񋾬訜ԯk@d * D)v+:fܹޮ#;Or6E"zeLg'Y c"Ι8Qh1tv,ݹϤƟ1ϋB;Y7Œunw SWzky4kYWzNF[ u*1:im@C#Y݉Gɳ`+yHf&=Q(VuQfQ ;Dz6NV55dP:#+ ~c" l1^ݝ/ {5RgL ’b^%͑` )/1s.8Jid9 C#I 3tEd1CFM9x$@ lIl NwUޙv#}kIZ댘|z\TG\`ݱ׎f˼~Ct@1gM 0&=%{ 8!w>>t'ӻ,RVNӷjٶj۩]C}XoG'Am[CZW!~[[hper}^hYgGzJkE |rsӺ[ )ВqsˆC{^~S jQ#[]-Zoq B%byaPX&d߿_IfeOeWzD~G3igWLnF_94QWdWft;Rw/inÓ0dǒ1̴6z ܠQZI& i{{e3Kۛg_TwafҾVi#b@9yc9ׄo)J%{{R7:qzJ"v6J5S5(m dFʀyqӧ*94&j\rIq\1)Tġt׫Ԥ:(TE8e":Q~O(uO}TupO~~・|)dEG Gt*3v\^pɷ)rr&BS|IwrhfXlE?̯o>$dѸE]nm=̮4zKg!iy]T4x݃żFf/{{O?=pG˫J0kj=V+ٓН=uW7:Y,? 
bOZ\8zGw=9=~gL{85W2_|p.5sAsXWB-C7ӋFMtZnlFf>YF]|㛜ưܜzSxLO>ۯG_c3_z(yO۫?|ɳ߾xۧ?&|/x-Ο鹳&[[|tz5.4w|i 6vɻ6~ѾSaSOrAUwqô${:aasW`6jjɎi~tݸ9un}ݦ!n& Q ~֘Km8_''Zc^'b-i=/fzO[%v:#B6%c6~ ಞFpUd az+:]lЙC5.хD#BkLAȺΝhOhчkԷht ۬;'ʳ[w4o9GV֝ϳuG#(铭;G;6U?[@ٗ'Z.?reҽW/&'Νg]z>?s?\mkӏn&/}`3Ϣ4jR+^[`_gZ3D[ӎiN={Y|ϝЋKo0ĝGwq|j;yj@|h`3/6f=p5Vg% 3;(S;< vm_|x i)[֔_ƚQ~ 75Z{{=c;?~*uԂDus1 :)xÙiM:ŒJӛ<=úKwm_uwg_ݍlYʤֈn3(Y"*{sב_ڰW;Xr/{>U=e| zߏ p=R1ՑI1dwm|.UZ)%V=^7'\b-nTvp B~PnLB]\˒S,Hڪ~)0d>ff<?j;0x1KS>ۏ~8q8?=Oyg)fG!Zo; k~++\9Z>.Ϟ=sṕg!::Bw)z4EppWY=uNm8'Anya\ 0{ߠYftyik373S^ҙdžޠfitX|զ krtmf%0Jz]>fΖV9ۉ۞I=TBƻv{/'kFnoͥL '=|K(vSP%egMcl/Ng,i"\v^?)^*A u֪ܔ8B.ڬe4s#εRDKZzϽCMTDzo%[ZEg$T$]FKah;ΣdsrŇbhhSt]G{_|i j+0ZjfJ@>]$\|2wafhOh[(:TӨ%gC _ x3S"sV߬6s˺8M(K66Ld,4x2R Fa_/1!Kh*gs#Q={WߨGJ<8„KZK9Ty~U[V1\&ν֩ d*X J!w GUA2țk#8TG*Y76E5GIl1[pYc;S\2֤@{ur38:Ahkc0mF 3!%]v9X']4/V[ZQW!5*dlzaCKU+?y%E6k,8Xe dGo͆HudV@N0Q9 _ E*UsFG>"R(- O9g.5:|L!xq9xi@yGtaQ7P߸)q {hbu2\ck(|-sb NuhK;wGV-|JKiF#.+ FU,*;\@RZg=ak+C *Tx@5w˽AAQlG6EPL1,i@;!np* +wIQee +ʄf$4 (F JH]鬇-P*ӫl5c;H&j! ]V۞dWNc@!7CACLAAf( Z >&%x&T+4 ~DUuASPb$謱#~fQ1uK:)Ԛ8c@9|Ȩ ռ? `ɦ= p\A ?^5Vh│DC0GRF5U \!Т-K* x'x)J6NZGRAAuSu Rq*Ά.g-(}kWGAyǒ_5Nh_*}=\HjqPBYY֚ M2+V>,{8M23.]g'X3f1#DUJ=IkB /{ax \'k T xV2GH}1[ڸL}uLC=GNƺ5%t_xOw )x$Q*LFȼb|b1;3]7h8=K̈%+&CFPr#VH܁m CK.,d!:T?QyqT'dP'(N |cM0Lv_>w%|4"d*KqSе4Hcc.u%{6@^,Z`ۢkPK |T@PF0X>:yH H jcE+5Qv5+ڱ-" 5D7HX] @K,brjFz^pZF!/D(8 td+.x݆ycm \g 12iUO_Sd@!jPk;<젇XT&.*b,zP>M!5RYj-Ѫa_`E[YdP55Y,p(mJ5[Ss$TYc!-Z퐄M٤a`F9KPa\qj.2ol@?X<|b7}κݰ*iJ'1se:," h36{hfѳq5M^vV8Y:u4kZsLڌQg5rFCk bLLy*{%oH#t9n(!/{úSi樇*tyF0B{C좃Y0mEAt0X$SSAvzDGCPCz uz\ ;ƾ:DfV$5sJmhB g 9)I*[0jH 1\U FZ*(.FR<z E-и `*6B?&գ-19mp@[>^bEJW jETAZQkLۃd &S@rZx ږ|z痝 ) ԟy5 14.2+5ZxXAҢ5MD9Lpy #Cׅ̆Դ~Z) "Q%q^\4 .Fr Kcb1˱XT@P@U|IA y>tIJK'4\&!K#քru3\Xx;jD@'ս!\7bk=uJH/2!^>HH !HH !HH !HH !HH !HH !HH !HH !HH !HH !HH !KaH QR7$Γ@2!D|LIH !HH !HH !HH !HH !HH !HH !HH !HH !HH !HH !HH !裭ҭz.o5廞ӛ?~]{o}w{xs4|b+Ϫ YVNf}fwUN9mW޵c"y~  l?loCȒ"v{X;*I)[v XEXs>_5q=wOj)! 
H?Nś?:FLJt%g" V ӯX7Ŝ/HRz}I$O|Wh~zW8AFв"1?G/̥4OS^bֹJ&stW^5%W+3Л,ᯛdO74 o .z6k ?bjO0ݯ%gkɱƕͯOFk(`Eq gTRjN]b(MP: ⸬9JI,V$ 9.Ӿp\H2y$X1f(sE\K*~pt}w!/mvPN5hpw{SnfOǟWЇL|Wm;)73 yڿOyyU)1{?©xeA̔~L>匷>?l@L&Ψ_{=吓EUwiD|xYBHMFQ{0\]\U%QQ Pbvހ2ޯZx~KݝMO_} ?ϴ}=c[[mxw}oteb~uo46v[-_- Ϻ$0ld^䩜JՆr"!#.$t<II/e16A] c;q,F%i2ղZqDJd4bNP`ȘJeQf.̴H9PAϡKGFs(AX pH3&-wރshOr̖:RɶST^Yp{KuVoEv7I㋞^UKScC HT@:RFABSU r8 "S)GnMڱN= 8%!jG\*;&iBMVHbRVۥA) ((ˀ.䅂M4V6ެ'GNeH숷96;Yj=S`{nzw.C;iҲo k9e.๎Q<@5WDm8 mm!&.(zrIqt7ƇkBp+=%XAc濇a *c_v5϶p۞#^X gHaHGJ9ʖ(hJiR!rzL28Wa މހOFYnX]{cG=pQ|fe\gw|5(e譋gN1GE ރ"X;1ǎ9^(s P'lKYFU.9n2D Ba`uMJ>%2\J$1)d p8"AN]K+j99m[`xoÂq/~u= ^kh|k-wK~aS .n,yftGw=Ϛ>[C2fAt?&}B.~8 `* n&Ntjs8rl6bK-BZv~'FP_{˫u|#8.`]-/zu=܄h.H<'w|:)C(Og3q'*!ZƠNKH =ɑtƢP9S:wEwTT]D:x )2h(|녢 JeQv디|珋=5 {R< 9&3 DA^*ZNq@\;pXK*(bvDBϳοyy~;y%0|֜ɽ]Yv6Y1^}&oj) >hQžUFU=Tfx o1p5wl-W>5%fM-"JjozߙMӹ4|>so^m^9mG+ٿ;Nd5dt2-_O1&vtR_RSQ%@OHBp.d;LOey S/LfПeZÇ?vR騿%-_Z3~r胓a{J` Gbm#^XJ!-Ԑݖ䕋K)oUp>@1ٱ;O9}$ۿV,H;Kt,X @y-, b%2[g>Z+#O2+`TQ ̗=C!"6<EZT/|eMdZl|WrhdB.]MUVc&wmmK xq'=/1J\SLRR$šHj(%s}KU]]'^=2[1t>AzףKmOOn֨ "Es =* J3KtLAyA_{1M屼> J?u{Iw$pbiL`Q茍ƀ ػxEf)q\ lOm}@fdKnK9 u (UE0( aȔ3f 5+HQФ0:ɀ9":0G$i8pDX)C8_6?z:KDlvg󥋁H1'Vq6R94Ic|?ָO$ +‹{e&O Kl(b3ܧ<}Cx>t:X12$" ҹBs uh.m[g4%x =W&?04O xJq` ˇ`麳Y/Nq:ȟL_ -2"V VNNnオr {[-r %5rJlnNYzr 77ŃW׳l`D|iv=(ז۵q?]^dŹҞ=]uCᨶYfyCP(Xi'{& ={A{Lڹ*P^Gɱ&2:> .5}= > J=&FhTO|(Ǚ^];N_4ٿ3L^ v`Щzx0{t_۠kTߤk]䬫_.+z+au=`dJ?f]}?wGenI2!z^e$^H3? 2W5Q5INi*DE7S6mFGIlwn'I땳cS:;aNSgoYKCT){+SIk'=a\BzG۞3HF-oROS+:n(*H1&9㌁ґsFuFa|}/Y}E ж;#SXՆ(`Er - sct!=8F#C o#[gMqwg_ȰUrY:Kmk<}0J7 A/L,"*(+nu~? 
S<`s-:OMcʠZCykpǍN&&VQ3jg>L(RFXp%[?ȵ a=<-Q>r#'@tIj ?GoƎFG{G˅p'ra$(!QR7egY x8R,6}JƾWt"]l, L[F>u]لW%ucOhKHZ,SÃ@tJR)Ԓh5N4B{xŀVy$F/eR8L:WS  >bP'i#g}mkNk*Hz3$.Kո7V7S֌Us |6Y-b`xxmb(iT7ʃrX7Q".ZӐ U=sA,ij~4먶#҂ߏWWE^U=iۆ*UԺj(ĕ׵@!5p ZhE]M[Њ%tՃLqfO 2sysө5z!90lpLp%4 ƌ h4-$ uӾ;""nY]3xůO0ŸGMy 7v_=6K_\>y4 3w!TlUG9f:aa˭'kz~ִ1xǯ'U)~TM$CghbWMl;:]f4&g[?}.u{:~[2J K)N{NlTA#O 1STJA8Gsm<ƖhLЭ?hj@@w~r]4 gO=&,*cFXqD@3"HD8x*' UzQz]<*OycU iS>JUq몽ͪ>Cš RO}ןޛn޴Z޴01{n,1;z7F딮SކP'{E(H7s?h^R<ҼH)Fw<)&7^ ~Y 4t}z Qy0LgnxxS8$WS/̤^H余ޕa_[sVe'Փ M.f;}4+ _R]@woҶ.kagE.s;@CM)cgU o~V["7AmMNQYbR!T^GNɉ(r+o|N%Bk-etȓh<^dN'nyslJ`RGSJmΌ6>X7;˜<[M{%@6Ŝ߿ymYW 2988!1[*ٔD M ^soI3S!Z=JdS0`Pf`8YSIi fY&)"وȂ6L'' ͸ͫK$9|T|tMM#Pߐ p ?[MYAY[5y7zو!%bnĦR!BWv0Wɾ5fkl15TA3l0f3`&F8&T&żn 626"xpf#*kRM.#;)>g9 ,bxX 5V@iTV_&+PlM 82(C:q'Yyn^w~QG[]hF)n% L+4:0.Vϲ;[PI683㣥8d;aVfo?@{L-7eBPs䣂䣂[bDТuۆwT5Hd V;NL4XoJ_T+"<JI?:|hPD)?J `h[FE';-U֌uއqakJ*Q;ZBJA[A_q3Ržht8lJA Wy~~%"nɨbB9R3kaS AiCQ1d2eV yQQ|q`D, 'TBF 6`}\#sR(Gɻ(>f:7.9^K'֖8ΈVh 1聚:HF+w"QEp,xަkMzɘ+T ;ʘ٨>q'lDS6Tmdcn|j3ە ]} DP,R>ګSMP)sʕ: kXUN 7tۑuXN4+F>`$^!nTXr鼡>T f7ZsL(QmGs+:2,"AqGQ.k%%RHDclACszK? #i\+C~lz}&2غ@KG 1b gR41/-2Kz0<^:t?_5q8b\fCSrڥ~~=tgFB0ւiS" , dB$D4@DJ<Ñt0X1^I ͵w|n;khG7㝌 #8yC3Jb.jF)om:H;vKM4*kc&PfTetۨQ1,%! 
j@Ca nl8KR x8cy`禎QcS ,Ȳ̦de]ݥc(픉$TŜ!sc<ϝRrl-rSr4Uj[d#A&2⨥NA ׄ!D!e!x0k5f,@Fkwm$Ilvgfi1i4b0,eIMQve%DJٴ0,ˬbUTfċYT RHL.*ۙwvgqBx9}:n㚍 /Kޗ\:;_?Qnқ2oM]F 8W]{׶{t4?νsl7/~t廓w_|Yo+o>oz{mSܥ4v+=\^|Džj3&>xŖUsSw5ވr&cJ]_b/a7"Slebs#f9/BFb"!;H?e r'GV1>B?hNt w;|ҹB-RǶ7cdAfLY5 ,;>CJ/ Vm22(%Bdk²NAT\2s-\P"PCX ~˅U#q y%d9SgC&nC-g~)2{Ml3y+rɫ>ߚ!pߜXEY>˯nٰ߲٧/@~U=:XŰVT:6S (9_g-PV^W/*3v՜kյJRzJkh7%ZAJ:5WTod&ndUaa78 Mg,',|V,6g~ۘȖ=o>ܤeF_0勋7o}*!t@MV|V&9 &"X0{ &"npT5{dTȮ">TCTHK݈.Fy,M;Em O J6,b3'I&F_%A&KH`c\a1*#Q\1+Vc" F8`M."YUO8jHٍCP*ҀX~3""MH)tRF>E <Q~3@18'A "bR4bb4PL1yܤSB$#vRVƾ3"vg7"?+\Ye.Zg7-9Ivp.>Z4W*gMB\R9Xjb!R1f0(Nx,xM;~w?aӻTcIp/w&Vs3< %ssBMuu(i(-Z%g/O ߑӯѤ(x.j'S#%:I H@M2iPmW0)#h\HJKFYUdtߌDK;ܺh}!*!XRNt:`2^ &NУ9\A5g6Z2Z z)*ڸW[QoDSn&vh?v rS\_hE;D_l() `\HE) 6Z2Zl`rY1Y0u}ܶ;pLE]+SGh/:*QX*'ɏ2,*Sl\-tj/A6%UW_Bunig] ٠WST *'rp 97-9gm%R)砘;7pÛoE@NagmbNJ1()i a KLNAR?8'% !DJ cF NTr$">q.qSڰNJ_$ ^QT^%EJ W_)j`{T҆Gup~+"vV&h(uSc3}CS.\d!t3$T>da* ]r Oח4->Kg%B^TlVPŔTsN[wg ZS-6Vez4ߚO;?^v&UǸ[Z]z k9dgu##3y/e+>}8[!d+|ZAsEqq]1uIrۨ:wYG ^c4Mޟ5g!%}76y@A|S˫#-$qw8G';lՙGPY+zq1Rjk fy!R[娳=1/Ĵ䞍i{X_ϛLRfS}qi:P b'P,!hDB6rLM>FOJ}/imc>(Y]H)XrHbRdBv"Kвs?l(tDE:Ł{G@3hHaWt~Qɀ5\O3Q=>AϲkF<*H ҳv rG Q Ad@Ze˶ 'ˀmutog7 mxMEw[\ۤb&8B]&]&KD ugOx}~3}~P]97K Xj*iiM2<|9g*g?uzօ(o*$)h!ɠ#^+3YyZL1NQjVk?Ae6+()$KUFY,{n2+V)?lυ?k) %]r |/~bQOb(ѿWO /Mh^  c,ty}8~"8vگ&__t'L=矫ʕ,g_;DقL4RX=TyPP8Ee?Ҩ)1_g3wp~}Sl|:@>>G[B"$Pr"Y)eǘUʤZ_Չl9zW;[a!d[i6WP0*Ȑ8WV5*nM ׶b,P兏^R2fbN .ત%VD JEH"oo/AtVcRFH)ΪV`R8L[KIA9=tindfpj I՚І]'D,ζϞ PfQ_DI%< M!*!Zó[l֊ [늩X>Cg`[?qջBu3[_ _jSV=4`bvz LAoy̯/8,J5Y!B#P2D6ǨC x\Ҏ(rYUNVeN\3FD+I,3f[T ^TkrmPB&D5}d=0i*F-AJK`{í6ٖR|bLHMǤ -rVؘjp)BȤ[?0L(T6fzTXؓD)";2a U"EʕZƱ֢EtYhG.#+ .Du\h;\]r~;aJ8 Fբ6a6h eOak0] Np sVkN=8ي*28Q:r]jm,jl %n2xGjPmVlc z*0NZ:i76`*ARKru{g. J+OE !4WV) w='d&UIP*h@,.Fj$*x & Ek=2GJ 2z*! 
v2>`T q/)fa :F(ƅ5*<*- pj r[gb8.p3SQ T#xh)B6 Nr@AOuD@?{F] vT|HA2 2aHcS͇(MQTk,auvuսT9WDbkt]Zʦ#'Fc5J1C0}9c`E[[[ňq2mŬ*$5࢒ŲR;L'<aZ v&~ХOF5eso#Yo]:ܦ:/x1puᗆJ{dtdnIEJ rCi`jc1;`Kj, tĸ0FRi2ӲB JP`H燍Uh\?{ " eujBFYuAJL(C{X !QBG$h(S`qΑjT%lvU-is^3[ZlB ͉E9 qإ6F`$8K-F|@D-;`wDd !ܨisWcpcE*CDYXs#.mtԞEwf,xmJUΪrݶ2k+^cZz7*-H s+Rt@_kӤE akC:[h:8lw[atZNWo.Y\ӐIM3Kx0$ڢ+R7]JqL[ [ ;HSv`)jY|u.֐Dk sQS8! 1G0e@_ iDcuR"ʥP4LTzT*|Dpz;)Wԃ0V2OW "U.s -pqe2H7}I5w\ Jt0ecj2Brx:^bւDL>QWG ck8AF3,dyI%ZBLQ;I7]|l]ǽRZde7UcM, euLK44/2MxbW*JfOn0:/J,V^ T͍BNXJ5١닫\/hŕ<`thy^iRB"%$RB"%$RB"%$RB"%$RB"%$RB"%$RB"%$RB"%$RB"%$RB"%$RB"%$RB"%$RB"%$RB"%$RB"%$RB"%$RB"%$RB"%$RB"%$RB"%$RB"%$RB"%$RB"%$RB"%gUBVh5l.k>uZ_\FmI:bq Ў #'(9BJƑGH'wټaoOC9AOyr0fEۂ:Ki8Ik(媈 ߖam7չbDjv)c2#992:Ú`Tc3{*nos w_N4rG>LxeIr_dk@؎LS )mXWw!u>^_U4QV[HXM 7PtVA2LsLsVFQeE 0Ghcy=_c~w>emض &v5 ϑe!i#en:c9c͜,H`o Wۡ%2z9h sownap[?K>3~,\߹Au92sPVhU)8 5pNP`riCg$3_\z]y4;W yT$Tx)fCc9Yl nl#Gʗpmo.tyPs7RscAU$HlQfc):d^{8{gɍ?ލ;eq: }1`q?t;rxh;t"{7s9O3?\Z;iGnδ')#JBro&m'n7)7)c6it59|]jO8O?i 4+Ս5kkKiM@l?OK;zm?ԶL枽cq7ޟGZ7 RNֵ;uyLo[V@seRv|Ƈ?7ܟZ_hHW/v~w{X~GN=kENl ԶZ߅Z|Ja'!!?9F] ;󽇚݆RglJM&H*s?wɗ0H҇n^DdݛH/OD?Cg$q? 
xZϯ W/h`}yϳ-͊ϒ7i}OהGlS{DMX,Zӝn͖y{ZL?~MJi.O:h}{{ppO>n[l'ϗa} h_Mr^}y~ؑ7=֏(6VϙТoziBxԳP0G$7VUY')u+PqSo{eNY˸)qETN$[k ͤ=mL 0< Ed$ִe(*JL3&}Y2_,0d 丕85"x4s/FaWNbp(%!Ou'[_FV 4.=F`j1|B6yhbcB|)m;'f 80e>c\SU1Cz |=v> ʊ,%Urd=fVcлlgDV|嵦`ev"zUlkۄh:K~tD1hsjy}m K>I:*FFÜd&Ҹ$qФ h4FdIθlTJ;΂RH3vQ5W[Zh6Lѽ,y&?og]ihwƴe->yC>mB.dZd^jұf}oÍ ͕~b;W̓LE]N-SI 5$&G`슝]}.YX.xly '\n}m7>` J6bx+1?{WѤ@_#CHd/b{H2)Yv^ouG#Mz@t*p0!waי:T24OUjo‡dCɿeKtxxpy~q0QISm)SOЄ@jN |)0:Y&L cj.n 3oS^S⻫惣yp3T<* ^sKln6H@ժ_.31ѩ{ hm-5Cڛᨵyey чjm`ͤn:0t{A:V&{8ma#^}5s}q1In1!4B ЉWUu_g?=O}8D}t|3p8.Flhǣ_Ѵiho4UCӄ9f—۽ƊΖl,n"Sz\z?]E0!@ˇ%ԟNOLJ6,b0.\ِOA7UOu>b= >?d}3h^q{P.SJ8IZ K|p:x_ʱ8D_j%q&87&ƥ̫mbrWt`2I68-:93"I>)1ҥTʩlw˹V1k1.Qo&;[Ε:=TnJ]tJ]v\iSw&B)PI[ӅtCJ%R2|A'5X[$tȂ5-cHčIK]s[cFJ8.UZşK1q@LHwQҟze}e=!YJ*H1MZ0/'|66B,iKx.yk0>d_Iv~D_U^W-%}ݛk|1yr-\/볞ilPsPT tZF3~Jw#A*rﷶ("glvP% S Jm%rFl=q.bX<3Q tP``9g\3<:U~wB`Gm%BDX.lRfjgۋ>B?coxKT'BL,[,Pj 葌EHC9nx9KwЌ&鵔Ӛj~8g.D:+.3QOXLr0Ww~>r&ǻ_UYm;IOzJ%RЈt:Vj.rCA!+UU5x{5âVޙ /wQ~:t7QZP%/5Өe|a]aǟhnV`US”_<8ir]#_TO׬a<"]&IgsʟgEbf(.ˣ7}̐kT>dJw9?/u{ve0gj.Cjmp~R7:>0ws/%epgaG5F9Ykг|T1) |Zzy捂H~y#X4oeܖ{{v6˒,_Ne;HaRt=l t˄ M0&˖6&lq: R){MܜP6[?oqo)aIJpg)u9,[X-ީ:z'ڍd'ɮz=gX}'@\X/fdra.#VUοxIJ8Y(o▱kFUnUA7Ae ;WG9wGK@(3YOb R4%SnFI`β.Vwima@s<a[&@s$]S`c8GTqRǕH8kםW>5Wŝp"#65s1/Kg[F;;/{}tm}ρK"aqR 3J)1/<:Yx+zag_m8H +z"r%ýyCi1Kuti/tRg(DE}zۼ'|W!'SuY~8Bs?y],L|NF>#0%jf@ijڤbjl*ovř*ݘ灠x>_̛ ϰ 7[ϯ.ff<)ބaUU۽MF@NfqZƸ$,Ij.5y0_$¾-oICoz9Ides`bCtIQ&9lw'ۢԓmU:U(~W; #//Djy?qfy|JC:$ nt!30;!o".e*NDyDI%_e-jmblc^1̉i/h Mշ꒔uGGS2UtGMa|n?!DM;a ԔcֱD;wazitjN҆Hq4,XaQ( ASC9%vg`4ՙׂz"c6>β[xG#3>e8޸71eQ$Rr# *qe]HO2'zY]&ʾ9>)|Y`";*Uj]%>6yqa2ۭh cAyy%Me $tr Qy{/ (,D耢DFtI"<KCo4/++p sjޥm\7`źW~~wm{"f۽g?̘Oÿ^#9]QKX^MJ @^[2Hh5Ju_%ZWL㸣7zm.o o[y[ʷZ} =K *|D !i R $)`HDAae9D^cxBQ^Dj)僉KM-c>z$RD{&;Gָta-"it-7Ti`31aYIԲVf#G$}ٻcz\kKgD+i4ՂDD;aI"8X<;^)Ѡ`1/K3RaҥFRV-,[ܲ V'vcW@ >XbVDdQR+u=&}mEl?];[ٗ탞zԎNVDBH$`NJT*1ylylmX;!Z LFXυN()I' ( e5BI`,g!)fǙ W՝gTv_)6ܠtUG lv-;9{YHH# N( JsFZʜV +(g8z!Ay #ctr{[mZ3BH +t$R#"1)X,h(iH1Ju \نL4QAJpDRQ 6Pd4y֝lj"ޢTE!*/=A"SE1^kkfL*<7ĘO%sPwY [ͅӈZ$ H(CEa"ksD5Ah Zl'⤥ZQ4! 
Fs̉9pr>hg8z1[U7LʂCyM#B-b< 7QEj)T R WIc:[m@߮FL ,Td_J 0BB7n ҧ@4Ј@b;']Gc J75/ YwuW7Aɱ:sv`#e~lk֤qikRJ)eK6H&B)$t׋]gt!Ak?$~VMs-qf9WUv:T鯎体PLzn`\ޕ>e1F: l0 9HysQ 0Kʶ6׆r- i/o6y'P׃K<%a.;6ײyUU J}gȽ]Vxiw{m {mc;/&Osor]T0vGA9GPvU(HK11\ $Ϝ$㑽x4p si%Jq]X/va.ݗN ޶Kk]:]/_:%Z+ ` F k5cr鼡>F&r7P1G EέRȰ, Ƹ#a()$juMD傅ޢjdCtD۲*/c+;?+^oZH&Ȁ1+;<7\Ls۴ GK7xo}p#̤PAćv d?~:m 3o#CQ!@Sk44 ĀB2M" s"q|'wr1k:qsy%~yYYmXFݐt/\nÆϓi,lp8`S[J_Kt% m2"Nh8+_T.ohz}?Un ouCͬr'tپZ6ԲSݶ_/W+-C17m~v3wyoEJWxZrk>bF; $Mwڐx6Oi-ꪳ>p&Z>5PrܑgXbVM1|P6^:~ȆӃr,vF?PʮɗԱ0]NB%ELU{1XLf$:Zm sǎD`Ck,u͚_75cu=F%9e2$L2jf`ٕP&+ߊa" !81ʌZm8;29W])d4F'{e4!T"kf49%a= /XUC^K<{(tNkQ +H"r~$%tVͲ,{/ik2bZ̒ =)h슻_B3k;gR%c<,a(c_Y*BdQem IYgw4@O>O܇ş5h`0'W.m"i!U(H 9G%R{K xfZHuJ+32eϣ`2!E.!ShCIce.9̾vpxHm<,h+*HjԱ֕v`uI g)t>rB90$"1m5fzc9dRDUy6>8IȐiQ&CZ$a&H2>BIN׎6a/jx*}+Meh:Iĭ@1HQAZX' t, $ޅ pU%1mt@r -I3CPlndc$MZp"D, axshslըdO(ye:ŭʣ`@$cH]dLЧ GE*d,?HG HN.C.L6ZԱ<|#9"_6'%{?1{lzP%Tp<4k$6h\ ^v[ X6{V3i +f r\x d̒ }"!;EB 뺵Q*+>%."uPtMTIc:s'[WΎKizJI+.i+H 4,+Q b 𜹱<Tk IlUDkytš810;il.0 nB| 3SA"Ԑ<$RYADg$!$mww$T C2#EJSD{l։7̫R nXZ:L(C>}0P0 E|tyV;fz~^E:''6 SlBtX)C;2 *N"C "9K(R1NJɾQ(v디}wcpmImœEnәآR CdV{)jk<ֱ\>i{FKM/Hhe>߷cm>emO0dDVF~`ny/@LWSboy_:leT8Rd:hYWHuKj-孞2:ʏ@v?+~^OxժuK0 [,Zu<헭ޢ/RP ůAl&'tBD'PRN:a } Ȩ1-֠% 'L;.TJo L* Xإ s="-?z?0m{!ѯCxuɻmQfϥ95DhA>zAzkPmI1$QDkW udL!\sYRQJV7 )*!z]'Eq:W{'%io}6:!H)&}.0`z4H˓]ꇧb^ztfö_ӡl KpQ&ovX\TrVYfqOkkm0xb|&'3αpuu3S%r5mp I 8K2NifK9q?BIxfړ&zR`es! 
NAП2v#g!.+ѩe%C" m2`,]&Z(eR$xdx F`*q.#Wz+aM "W61BK̖%cOR*꒩:APzN(c&Vͮ%KKCSKx/\831#垫XӴ7iMC)q"I< >~, 3L|L&ǗœֲԚ}'͌5imFIqK@algWqd.dJB, \360y ёDO2tp-pgZ~0}P( c¹Ȑ7ޕo|heZKFZw7,Ԋ\iNӯtV\:<P0{ks͚65S}ӋV=ݮ?̃lm] i{ ;Hïu'7.7u#6wnV'bDLQ+h*|4|^,emrjbz8Wkf^'=-xt|: ɯ+of_1c|K5/ǽؿ r÷??}~x?>tV`" g`EGtjuM-vZȚ/~$c+oYU,qyƵ,׿}=oYEjx@ܞ^3T|{ rv%ҟVða5EVe[b C6:W[USoa-vd&'Q_j$Ơ$gA`NFOx+xGB*nOٖ\'.ZpiZMl'3@PY$ )*$mV% Qy)-F e3EJ{*:4:hrȮz;{xǏWѲtK.N]!yXQDqV/ QWa"/!th9\{C(4A"[Ro0*{7i؛RrސԢ^o۴ȭ=7mlo~~nK}6[T.XgKZbz 1( 3A*/ڦ_C(I2LM73a_)Ty[69[fЖdFT"ֵu`;RD8.&~%ω(ɌEdXd1o杣WRYb>Nd+Ni0U Xc0#w<0OL(KJ:g AΎ|y#Ž_Ϥ'/]Cϳ7 9,Z_MO )}ŀ?j1$PT۳ 2O`^aċ ͢ҽ͸G%43a@q2?h3è+=ǃZ& 7aƏj [=ϊ;]~`xa'o&ޑ.?~Xq%O;Q;Z z.=b;I`yǐ4$a$:[Z j~u7fo1ޣt>!ڨ̑f}''t7RtInWP]KY~hmoZ+z?-s&ڌ!C* qC{,?Qԁ u QHzkZ; 6Eaҵ[X&d.:+zC6[t1q3.IxsD %Z{Ziyt8[\߉|Bʗ.r-} #^.zqXww;Kf6}fwD7vwp;SaǼܓz>}[w?Y i㧳U; GA{јxKhIGʱoؒx lq1slg^+>߽RiW\3gQ{P>ۄ,"X B)fB9c"!mNOy58O)_R;".SvnǍĺ~|.w =bBT\%ߗZvysZ1]3>z]~j1 `)Lײ&p Gf?R)Fy3wBgɲe8G-QpZP&/r⚡R혙eY7[~lҷ}Bgʲ3qC;G{ ݣ.,/9hG/.~/Bpre{,%Q H)Et,jXR>{L&Ҋɓz:!q@:Pgf6GJs5;5x+x~uݵVnq϶߀Oz;ѳD֍zl9jxGRs|.mUiRr)ie[)E/9x{R&,F4CePymЕqg5vjY+T1YUNXFIF A{ GJ7;ΚĕYoόflsMtoL~i_&mzJNٷXy#7b lz>HxN?/S Bop攔Oo]Lneֺ`# JdtqrtW} ]bs {3Ho&tmr^b1$SX6Y=zeO~n!JSp]OL[tKC7-{6+.Z&[.c)k%|.i=}VRinnaUӍ'}6yȓM]_J%Ts}:]l@Sߵc}eDr ^6YP)[ɝu-c w͵+n[Rٔ~B.ZU _t+pVS88J&]E+#wC []_2OySϞ}$Y/WŽ%LO4W?}W=K B\_CxۈÌOiKG\l.T3ˀrm o$ÂӅɩlp_iN2Pq/p6ȘE ^Hi3٨"\et&$ J` 5iwrd{zn>`Oh_; )o_G1xΠ6kRRGz眴&ZsL;ƍ!!Im,$3(5i@R&d~W$=)Zɉa6TVjWԶkKa ë^^=lX/lguXU|@fe֗9)ffD`Ur~Fk:(_ZAXkښcYh1dzR*3TmR\#.m2*la58ʶPśW˛:t @/nO>݀]^^}._Ŷ)@ަTEA^+$!d9(9 v6ꖫZȒ8{.  l']ʑM&ߎYBbsHckbgWbծ&Zm]jjvL"rF͞$!' ƴ1 goU,DU{6.x$'!EHlML,h #(Udd\j췇SjlX XjqE4--hwznRz`F-I- B R$ߎ,w¡/` 'Q"i#7NJǬ hhJ+K̍H64 A +[j췈.jOvq\:Xg5)9.lq]yhpV 1JHJr>"GDg_#Hh_.=&{?p ݟ6+ n~4ُ9v/V~$]EacCTHp"a!IǗK%;?oku2G2Z+0@QNzERAI>K^jAp a0kHٳDȌ7A[c䄟SZM } e aDȹJzNȒjD6drRi}4ʹ0: )UmPjiy3C6ϹŎ"r0Zr;( #iR@I2$^{ J!qVIȜAè4=% *<ٹ04(tXkw#/p)yc=Iкh yf}.@Ά?Xx`;B#$ȉ[Y .g)Pd 2!\Z lS2Vk9zVի߾5[BCPХβτm4QdXDlR t."=6(5wdRK3 tU R7O,N=* HaT@)p!H&"GGC)ȓZNR=4Ik=2q1ZQ YgJ)I :e%b ܲ@h8:KLmx?m5gAz^9yld1 P. 
"M% R9eɅ3I1i^Dx^M/djȩ)!!!&&m:0Θ!MT<h 3 .5Q=T5s5 9V;xe_c4W r84uwDYof@䡬dOlJ&팦xڦgmچdmZ6QFtt1"L\J3ڒoshW&& +kY5VWty1Km/'?o /K+sk9fNQϪ1{ZKhabqHA5b(] t$MLySofZ=|Yw3E\4)7h"ھe1@+BL-Jz56MrkW߽}va6 ۚb-x>p8outa5:k_//SPm-_{o@jB۝H;؎H!-ed/'NEyIF^Er2A,hX=^6rLM>FJ}o۩1EA!银١TމF"W2+wPQ r\t)7fuYK`{ڐ~цA;bg:&G͂$#v4_|_|v[Հ^t}>1@ESk.,߾Rd h )4KDciP#yxG(S~ԌPxy(E)pXQ䯔GLFEa5(6Mn ~v#' s &ԙ{E޵q$2H} 0^91CL"eS=3HICQPm'ZZFbB>Pz-*v ǫOW vh6#<-Nt[ 8Jb,AH(udy:y an`5$}:yhURh8˳noN! ҳQfI \{` DrεmA !.3Y`jD"ȵ9½o]MnQɾݢfzoz59hV ➋2 '?;h8HKU60)yH9&d<ܳh(QA@d%SЖ@pw9; .I22O*PiL7=e-X"xYY'}:l]LgR}MfȊu^1D9ʴ= YbjG-VQLHxIKaQZGכpl1qn#NyLV+󼗰?jtMT%H:-z_?T2*6W|y[b~Y4WEOFӚwipr>YyS$܂ Pq]"Xtl2@ 3׃]?j#lS(VځN)d C*h]"ږ8 "!kc;Gs\#DȔJ%AcVニF, %)HIVN[)OMu3' w=r\.D5~fXBlQ]N*)zy bGb _Oyy8 @<B0QE&C[C@A$|'$"!KH*'t™2_9,Qҋtsc=I 9Xڈ0:z.|BID( ȝ "tJT@Ag%7F+oM)q/ dR;k?V rHvm}> n} L2A=m:zQCğQ}>N$)QjFDsi@BDX/@Rvyw ߜ~L׳U&l4\ Ɇ䄔$g+UFr<o-\\κu[O*D=QҀl c%x}⎥@v^޷<8!g c h^1RPց(@ 9!o"pEv)c 9 hf Jq&892;S1xΒj}a>;.𛖼5/AIEՕyX_M۲i`͠F-]٤|2q:8"]S(corOl49`"B#3>4c5hbD].%TQ;G1Q"uʻ 4!2!#BsN8O?v+Fn0'Ϟ PrQ΂cQGt NBFd$\S}gT{DBt2SE]Kvww`{t#=m4Ap6`S\Pd?ѹ,$9Q D9ǽ 2{Nll|i*nCWy$x '; ="-.{3c/LS*]ٴ@2c*=E,֋CV%JNٶJZެʖӶz٨޷'xOvZR-&ge5 b˵4įjnVuG!}]ye4#ÖF 6j.MjWtQZ׀UKzStOG˭N[fʹZYl5[*䪢q劄1ME ~V#ubCejde[56Wh ͋"C BWŎWtQӖ֭<0–a|8=\S*MKyNW7l$9?Z-&vKi%(䏆q] %vg_=ihmrR)G{ރx lG`Z\wԹZfw#oKMwɏ1Z&)u^Dk2,6#ަlc?VUdA{h/44*wU,]9z=ζzf'g?xBeC *F:ι&撎N%KY@JI&QA(upm1է{f?_PGh|g;\"9'm]2$2MDJQNUs-$UQEƵ_z餜پDvlolq!T[=l}kR>~"i#I@QqWvvIIuດ^RCȁ]Ȃ<3Eߪ/ETYm1kN , pL"2'wYp>$SnY9:J! 1*CN^hS'}(dpA2BH£_)Bk"|RPOFzϑ5VHsxy%ȻEý}r3'q}ۉu%Js)`3d Dn(2"2X͜MT,Iiߒ<1O!"m%\$KrsMIBS zBzBꖐzaٞ 7 "%I5/@δ !${ O n:-ϟ.4 "; CcT|^YpgkP"8IF-rܔGYpevS0S\ ,ݣn'S+cjJu^ßՋO̓7pE~pp[nvIA[m -&wGlm -5ڛ 1ѸQ8a r>ӫ6ٺ1 ZmFhuUqN#)nƓ*W*&b\o1) !tMueQO_>}o^~>~1eͿ5:EvA? ?ЁM_Ѵilo4װAӌw9f¯{sk~E`93Ȝ,~=z~Lđ%_^(2ls0vZ'RL T6t-Ӎv5viXF \i5ޙU4q/9 vjZw.')wRPr[X9jElkǸJИ4%i)浶Ip3QŶG܁H,N"g" ˙9t! 
O:>[6Xgf7s vVBtw{wgɵ͛w,5So)1߆B"[Z ~g_a5p+& Z9;`aB0"Iu1 VB8`K' M8p;`(:oCunga$Re|B!9Vאjy60X@^5,sTVdL&n qҍ9g`16x+;; xJcuW 21:EhYMy:sZ,i7_ (# 0 <'Fn2i6+`6B$Ý8Px3jq8(jw$Y߷oހ@GeTsxFEmI[N>ź 8}VUw=sA1zF# ݲmDZ H VƗ?#@OC} &̳a 9yfZc>9ow_·Ww=b|R$Q.1SEK+e\K#HTo2{clzfxIBs%^ /)8y.OcM6CSUk 2Udߟx|{_#(Oh{8Y(=,'zW' `G46|DAͣNR4O 'e R sO'NkcOC͎!RB,k؈ƷÎNPv8 M[Xn>dc5L)8TkG d~b_'G3w{9Y8*#Չ<3#pT4U KR|3ϝYۙک$w0;<~BG:I4 ϵ0J!j=NW%D7!y|(kcodsY568YP3 O#x4#/@ghK)ew2x^V;{>[ fg7gЭSf;Gƕ&;TPrF:6g-Bݭ%xS.z:u!rh>PDmb% Ebu Z!ec0g١qŢp?́O} ˾e1VJ 8.T`c8ƌĩ&X  i-N,P.Y((}"}q706DOӫ~zRڸ)4u*6ŒFf0I ET8UDʝHNhQDh\V(;hf&C>j^K `3*W*UzͦXƺ}\~^|.]jv|]p3XfF ы;gF?>/{Ќz3lt)}EZ,/8q|D¼`]dpx1}fry5EfJ㵬zBNn4p ,&3>0?Y${SnL Cdv(Y6xK^/=.Դ!̰DDhe.+|M'\{`zAGO32Çwszi[%{WB$0[^CT Ix zD#JTQb@6_Pa,u2>ػofro^x+{hDޔի@h{'TsKu/ʅq8WL ^vus3}Y lB)ikheFg.e)ly˛v8 F`dzJ(ʲ 7`_PU@Gc(_(uYƢZ&. NTvkznm;?ʻ|lŚ˴Ba\ ~FzFOÇ=ajFOв !AjH.+Bh3tȫ~*txx*vv'.OE=~=}\;o}lƾĴ1A@Ict*Ҫ9UA@8%X"Y߫vz^usA࿜{Sou6D6;޷CpyB5K"թMb j Y8!XD X,!$%1KtJns700M A'\)N-P9\Vy~*.0 3> CTWM(d+Ae_<1:X,hd<^vKϫ XizϾsv4J`ym2>e|z ѴrХx2v-PLUiȝ?]Rv[$YhȵfO&9<ڰa4{4fK>z꭛wK#/>DS!P4eEq#Xqp|N3/0ތ7ɳ?\ Ԝ񕕰9'5L$€%1!LY)FNX&VNR#dftpzŮTeNc~i"MV]QCPvQ+j4oXM& B;Bh,cNGtsTg T!ԉ,uD[H@1kkg쎩[RB.ZkdO8qV#MpDn* 2w+uFBݠyzȯ<#T+4C*6M" ݓOE,h]\벻]h:lGnI4ۙU4" !(|5ޚbDK 8;1EZ~{7``hǕ*q;FH٬;J[2Çtb.l$~'cDDĿӃ`.tQÙL*()e*9NfMH( xMSS< Aex\ʌ5?q5pdL*i)Xw7LIL? O"8RZrIdlcqY]*Ls +T\SeEy*ӇMcle`[Ƴ1yϩVjݿTm}YRz *tpV3{a[o9lݿ[޲e$·u0&%8RiQ™@ӲuipX󔭝A ;#1$H݈{DSu!XM*T\i V84n9^9>W9);D 6N.rEYA ÅXٰ[ j۹aYqU2'-ò:Dj.ueQ]8R &I(K`#v ֏v @2h:s"sINy¶!o<ĥ뒧yieN>mFԢ׻^ 2w;{Ͱ$~V^ )qFqUD "1(a,*<* ΆB䢊z ]xbJͷD[j  6KTGz+FBuArZ$=}fsrԡ*axbQ/%U6K7ub G!_ٝdΨU WSIf]YEpeƢ@ J`6C6jke ),IXf6XUFAh:.km+G=yhtg l4:螗5xx5-Eӝ oQ򅲤 $|$+UŪA <{"#iHgJ6. _jjBAڱ|`((!oίB`11sxޓ7N`WKZ7\4&x=u40DMEXX"08{ S#ZΓ>֊%l88`}cl&ZP#X8`9@YS{= ѤtWA CˣzeBJS<5Ch bKyӅt&bdz"IŖI2v!!_r6vDMA GtJhF<r8VLhvBBr-)>S=kioҘ[!61nӤnf[)ٶ[2% &dh4#Fl=3q0PYp|n ܔ9l>2ƄYNq?HV0e%pzr. 
Zi>`SkϿyzpO|D1Q w>FL`u/mHsMLC[}#tժ|9͏u2y\ʖǛ0g ϝH:&<ה Q4HLQ.æŻRS1"MXZ4E𧑒OFӖurA"\M0R:xPm#VX"G\s+81#& fM@xԛ & U?Ik2ڲEDHQ \z-4$ Ƀ̜d"pxYsʀPlCpz{8 Cfa΀PT?Gy> X'E\yzy~-:lӱf>w^ѩgv5X "^5&JI^^zsNr73Wgk8}b=pT-oW1Y9}z]d$a=6ǽ6 8dٗyMa W\ZJ9*bڻ}&_^@ФXW`;X~(B>n'_훧,N"i0zTނY D{q6zξ闷 >fG"a04<{W iE(%( $l "dr7 z*1򿯗$oVJ ړNč c6D"CҦdOHa}X,(09d|H:4zj H$A% 0S *l)DX15 %=cU(KkEqM{yhOZCm ϳcPrZ^ܵϿx~4mz өxs]Ysn%{'6~j;^{)s_j°d{Mr>_T iꗳ)FlQ_/[oM7/{BK.*)*Mi]);PՑ1NI>IMbD9]4F0RZӓU&`vp7l:4I>G+JGIIlbBébMiʑɜ#t& F3f\C@`jla02*Ձh$ C3H7jYxQ-k! \9+XAYUi1 xb5 y#%ıJwMYTE(eS9bL=%S>_g(M~i?W#AlpGFE1ָb^J(Z+/J, _;%9I&Q%kMA je3c(a4dcQՎ!+8:@RG$Hh DVV9CtlP[3ڙ,4IIZH 0=x}`֚3*[Dl MDK5C Ö !Jk05VzΝqiƚxl0! ?|Pe)͖-0C ,V+*7ʃ1R@$x?80WZ{$|]S^k@u3܃qd3Ǝ;p&Ce16އ},JyՏO1=ɑHjԝwb@$*COY/oW@eMVJ V5FqRBx& e riko8 S7ɠ0>Ps˙]n8O8TZ,U= #c<_"WV`I@ao˭b}[:Lw7oXSշmbOusfƷR*q"}"k =%4 *`GsK^jTrɄd`{&f >}WJ,/宔Xù%d`3D5d֔FI(9Q@ hRs\/Q`qr*ټ=C+gdSf9bzij}٬z7ӤaC,{L_K⺳O0u3@_`5^f[ۧG42Lpɝ=|:uvd=^Y7|& K9>2ƔY6Y+Rwz\":h;G6Ff]o]3Y`x~#ln(fF&hU·)ɠ2-a=YW0ţv  R vId?~vtbm=^bmE6OːbCvd8}~.߶b1&rE՛yegA1G ʹY!>՝ՙZS}ˆsҷ!~R㩝 ~<?~f wvoodݷOEO{DEMt^UpXxE/ءNCM_VWO.7SWD"Nm@29jX ~prUXmx,n+֝ZgB5B}yp079I)b˙w)1^+'P! pV|ȷBXRW|+^9_ƱE5b$t0wL,4*(Ց1Y /e9yqME) h9>^5HQ1&r$.GdS+)ɲ| H6ANXqb !^q!AC2GLBE=L|Lw謇5AK0{>`'ˣy>BU6>B1K>0[P5AIEa፳pVMpR_l鄍̫E͒Ri)j؀^xw! JAlMDrk̄QF!X5 }2ixftF W<=ipx/(K%Aŵ; ^{uwBNibVh3F Ù)&Q:vb}jBpM$-c Sӄ x[N{F{?ve:}hlm>vk԰צ;xqˮ.90/tD5sxwJSs"r xb glM6 Z). 
XI!Nk$'W %qh9^n۷cyW3NhfN&&az6 e4n竫LeU.E/^ĺnEzm+C(P`LzQc%5Q:o$7rxYci@#H r9(LۭtWgof}:c} [;@0ތyO[$\PE'[N6#rzRǻ+,ޝRgCG=7,0,ؔF쁗p"{DR; cj ;U]Q(:`n=PJ Ruv 6&hp3h%;Q;)\9r_?Kϼ (8 IKf B͆s,&u'Y/#5n0_XHp7?)W'z%[ⱢTIϨE5 (6(0Dkخ tfF;- 3zȅj_[vKdG0r~TA{eԝΘIu_no>ш)m{c!(]wtretVB51ZG"'1 `Ywv-=: =륱$?~X_\aaC~yPZ\olш؃,zw ET28`IZv@/E1J{@ZR:$^<:{ʽИ;Js!SH&do9dʗ\itgCdf=(3QjLHؐtcc5*(Xwօ N>,a@&TJ2al2.јg."{JD1vNWI|*d(5n*I k-BRƘ5Kn|O))gx r;Qи;Fx9g;jp3I6+"aIk>l' $-L}z 3 `ln  MΟ>ί,+-t3Q̃x4JGc<~ji2uiS2ʏo6c 4>.gnsK/o'<\Vy#u#!_ȔBdi{ 61Vʃ)v/?b*`ڭ\օ|"Z"Sɧk71mBdmc4F1M:!nmu !_ɔ[q1Yɺ+_"ӉsBh!YxpFh5$ 6Y.KAq̜Y$42^XXSb<炭AGhKfxSkD3$Nt?&_§r~nPҤ=7Hk8aq5+奭v kk}ߍq_\I猶?S3LpY/X0˿ݡ'n]qWW/WW_,03}!=Z`}5/xQ끄ժcvbEXtt;@㙢p)١ 'uj N?jh+%ՕW65+Tá$d٦r /¬/؃x9)\t%CLgRGV ɗ1cw,XijWۼѡ%Sp3g-A[ndz3w(Zq0uZ#vt3M[n*ݦg΢MA2TƤX49LQ&i(^?SֻBڈAutgngm+p3grJ&y(ݔY[q0B:4cy,S(h-Ef^Jni]ctۡM۴tkP69w9򱚟Wo զaFkM:m`PQL& iS[mj|G>]՞s|(/0 ;Gv'Y">$^Eg& ('Y'9ɏfbԠkpQI҈Q':zي68L 42eD7uT.BhQ3"{p>?1O`Oyu܎AdLo܌{VL|xsEMS&RPyG<~}驯 R`o?0j[`Ø|?#uBww? s|s̖|s~1+#!^srvqvu:; J|4ݗ޽3 FɪWOOX\x&T6j}nrЭPC^>)K:{?zǎoo 6f>;̆8Pzsրc.$T;ku(Q*ǫeǡhݦ[Ұ &Rs{E؜b>Tt{~'?̮C,-}o⵽o>٤j!+cK` %TbO7wi"K/*vhfSỨUr|<pV80q́}'lи@q QkK;ɸw`A$4LZfPeot'[MT[MmdX܂ }5sI|m,!AX+ƣtu3Z_u7?RuG]ISB/浆5qJd]uTgyYY0Bl\qW#}h8ޢ\=8%wwon܂=7NkCɫ]d;9u̬3@XAyCVxseHPu:lqDמ:Yz[O'+2.h_Э.j ׏ᕀ-g[JXkR$| %@O>K}Ap'*wgq w2On u\heSQx]*:iQ)r $<͔}X?宿y+р-4mbϊ -f<=w׬ P*sg.[C 8Bao5P!@x݉tR 7">ʼnN:gMI 6 1Q' o q3Gw5A{L5c:O P89y0j(eD/P11YaRN6 L]4Q[%WF3%w8Athk0+7ܽ3KK] |5n*ja0(qt)1Vَ-I(f8]jgh{ѐuw?!\:4DO,< =a!YiԶR66saX%۷̳W -<ςVKjkMgz MҮI~qҥ'J49Uj\ScHޢZ#U^SVb]$bQh@dg /FA9\!G&K"~=AȦI Nu:&ulP p31(!"(Xbh)())CO\9iPb'ÅH(H@do4E.LGY A5Q;MWРĴarqo*AjKMuh؁Cr`^K JlD6Ca@c'R._S 0C;3Lx @4O =41yBmvKQ8L+xBFľ<pr"Ux154[$(lGş K{z~zͪd#i EO'uc\W(Iл9qQ(@Or)dYE[!bE4$FG3dtA졐2+TVSƷQvWwExE8p_]ɓ#pF4WX +LA@|f>!2:('lN"Hۂ:> Bc\Ԍ&.|(\,PRԬ,5KnW# $YHYJ2E,XT' !(Hf6VzS4Q0k]TE傧FXDlPpV\lgP@`ɭ84 :J"*ERH)#KHl=:FX8# ĮFeك!k\IV+/#:Yh5sK&3nQ(W'§XG4рfd6pQm5t\c(^TP^ߩ]/H v{j ؤ3v;8V9gih1 $ qSj]|& Ɔ):WaRXxt j+gk0NK&ȶkN@MqDbiE,BjkXeϜ B9O+d!A2V V[GPd ]Rd01S@R^͎04‰9U8Pj _\o\2G3b%"eb@"gerf6=c-XԬ喕 u 
Y!oZH./g1'Q&ÈR0$iA_C"1فBp4- ,{,O-?wv3Qvr!36 ѭbq9%Dm߿J2Ҩ1#~|WkɢX2&%Y8BZjoquǺկAaFgq<ݢvY4|7pFMnjT@ qG ;o.N?>m@y:XĊ=._RTҊ5>UZc}SOsc cr*j-,ә{,^Xb І4-ںz_֜߼=誄~\Wk/d;첄VB"`[і|b(uoOG!|^ _H}~LjW?s0~?+ZY[W*z>\DȎ^:"u tZ1^حXcu "tחKxC\SZXvXo֧YaN'fdjM|F$A/ozBjd}MŌDIʹsVmenMC+|kߏlxuçrwtmE/@=_ݜ_:\2ops꿇#ӟwrzr 6ݎg1D!FϮO.ՈeV9i8aCސ|XQf 녅*ǤtJR6weIG%ݽzvO5MdQR",VhY>X"3"2L>%e6'>*VdldR1a"XBt H JcL8JS04lm]ųdZta8 'VpN(.F3Ȥ9!yu0\|tĢ(3* OC塌n6bQ cǜl Ry(jgt1r3IpD|rNׅD8[RW\hy,8PNZFsEVYSsՃRpΞ&:Ak$K-h GMO#_iP8(Q6a]_[/܄` 7_#`"`D-G iQ9.=ҮPqz%P:ZIF^+t&} ɹ[?>rۊL K/2^?5`{//#r,ϥŤ&r. ml!{58BbT٣R*I2mh;pd/D!/P&Zꃮѕ{na:)YSEJ薽"0 FeqOWgXa`#S[[ ̱B+PrFjϖuv`4ɩG9Z79{`H_3ڰm9dn.*Ƥ )^`+$u [Q5AkI@Q˨2z&J xgy㲌6IP$'HQQh,:$& $-fBOQ=$F$;Rب0Sؾڕ@|9^d*@Oogx<]\'4a⯹j ?/o5Z5p3yt񸪳gɋE?sƽY!u0c_@i.t?Of;\q?7wB<Օ %Q>o%[:{dRoyB5;5 3G%8tJ^ӭM52GĶ^l ss;)ϱojznv՛kk?L/^wXrw|vY3h6@%Sfqg?Vn@s,|WFjarzrb}jI9udiDsIG\x野pp|H!a8gPp.w`B'uQBxg(N>8B0i/[&,wΏY@mmVJ@c3 L^/%LXX%Ur8 %>81QfoѨ 5#W{trYֳ'ǧ8oΡih9CӦXraUKiOťO@#)_K.}j {݌QRڞA$kY5J\cf80w${2Ug<9 TQ5XiFH`L(NW29\B12F$^7JT}}Lrch 6 Kú ɘXyrT$IB慐A?'lD-;[=O P[!c]uAo˨eB  pNF3גk!Q*ƒfh%?mfcz-y:GwA7]_::ڥi\TO5l#=ahqSYD3]?Q?ễ#:o.t2WtމX>~HxE 3[I>xΧOf7,u!JjU(?^Y{@[Md * 5HbWhf&d_j^')}PrKr7yYؕ%BVA}+J<=c]h6J^C \u+6#X^y o@Ih8zNv-GltLeӂ`Мp7 NikȚyh4Gv/Z\~F&+$ nu⺺c/Y:%;@ @'k 8,ZkCrh,1kS/đ QZƪ+h`VT"#ͻD%0NyQK0^G A""pk&rCB`̩ r5:uP[Ci\ЃBbqy}L4.+n.-6@G!нgu~ABpĐ9Eh-v ;OfcJ`toGdnt]gNˇۖWr] cOv@-$whQidq\guJF+WZ[1!LIjqܒ'l%z ~w$pbSѼ17ɱ$BASƃ4%qǼޕbkTki5,%B5t_31o;[,b`L/^_^7IVhE hGDCw+)):Fܴv~5!BH[0NR,F*kQ%1wvQdB 8~օj40ɞTGDQy )+w؈wIᙢJ  F(\6Eb+"`"/~ 1'~Imyx/q$19ygRozKD%S4(H "Ȃf ( RҁG."DTtiR- s"ד8Y]Zu,Z2'V%p?մ)윹ޢu woV竵Z@?hH2z*ZWp2Rp|vES G9 EɃaG y\+)!#eL 5UIą/kQm| +f] :PRJ@yhry_$4E+kj|q8) -_[*8m{ I+AEQI+?]SԤE]qݗKrɏ$;W߬d;km.w#"@˫% g8Cj c,`#/(a1vz(B#9FX)^6%=dWB`9O=h_,;tɼy[00€Q&qx$I7r8 9SJET"Wf[2*^cK*.@[\Tx R$\,ERRHMc47#s+8@RPJ98%9))Qpp"t\qQ DYP'=CVr%юUeyRy1^N8LxuH%܌0g1͔"1x0l)&wf_Ndf<]?4zk D[QHCi Oij(p'{z)N莛l;}41J p8{rpTm\ j'&q!G!tHRqvw $vqȨmQ!!1TW~}G걞'6إC l_'cSq+\톰KF)٬qiT(g'u=nВ3V:}h&]/ >+Lq2)B&f%M U^P%.NJIG Ԛ A}bQZw@5+!=9jG*SXQ 
܎SCTGjЯ@B+aHD^I"qR냯nuk~Ozx:;j_5{WBN;˯mi<[ۓ'}\Ӫx;o2 `඼Rpleаf {r~!QwijYn=PdA;iu*rN3q~ACeVgb64?z7D7[?qpjr} 斅c_[vg3n7Uu:8_2GɃ rޚ cd aﱺKo{Eś_Y[|OVz^B /-r# oYi7o&?{ہ7oofo`p=??=< 숿2ăzw^gW;jy*?zu(oOr }He~\XA8{PGJ"id>=Vۄ? ڞLT➛O>SAҞ蝯PU;m`JiJbU)hS -fR}#SVk; al>fq->FǶm!Lx:˧zeiza_OM.VC1E8M^&=K# /gx8/O0SrNXY3dpӽ3}F%2^f1sb<Ӎctn!˥Q_$Sυı[nNqI?]"5 AuCA'9A{Vj4ůqNo-yoWs41 .獼k)\wDJޢՌ:4My?QJfpĜwZ]w:mU|dYW&z z);>zvEӦTpv(ڵ4=F8LRiљ&= NM;=ud(/ x9Z|f V ׫;C`0zK(4b1%"sr42=]) 4pp%ops幝q`t~mJh8yt!m0ht}ul.yƣy=])1}^) *&K`8qH?vG$gDv}\%.jsqu$aω򞋵OY'&gNya o^vΔ˜";_B) >mI5cR _,ٿKyhSٌCp^ePFr1>SSNhd`7mwPN$?)n~}_F#:'h:,+9Wv#9ٴH>Q> Fi)98!T|"J|)]l dgj,WMI9?0oЫߟ>hx][}i?XIQmo "u[-s@(xFi@9j2&?|j3móY_P}PdҎqUx*:Ѯ(,gz0g1L@cW;@- @ThWp,QZRn(Q,HILUTJA8qmTAA*ƭCJbIS0)@I+iU $pjbKl1SJ,2d^fRW:#4N5J^`c(G`rX_L) l.h)K Zl! Q;玁D( eI*| )K@/R>?-w ?_o(Y_m_W?n;O߻Yu;6qU#?Ǝ_IÏ˱jm'~A~(/1klii (Nʂ a 3#Kƒ #4w"Q)\~=Ϫ,qOVffb|Rx~L d؞01l첖 O=em ™ŧ~}_V77Lʍqrjyg,U䯋Ǐ7~]r8r"$S>ܱn&!XP NhXt\fs [ p o % =A mnTެ[xLCCB.\DCdJ/HX7(u ĠquK\q*X-u !.A2%֍ 2A m.vںW6n1$E4H(|s׺+Y wmI_!eᰰw؈ӂL">W=!EJ3rH H U]&rՂ Ÿ>t-Zr̒#PS\rM EҀE8G5!CH< 8̭GYP|ԏCA0y) bB ֦24)$Xd$\`( FN1^ !OB%29 `5+  6h"} 1oND뜊FajBXzr,ދ QC0n1`"vDzSIax=yIyr3`(F !1`/='@f6ZJ.${D2EU5.ͨ796 Mñ g߳VNU<0wz6e [3v-g-iN0O䉌i —pl--[T^!LѬo@ 3կf4? 
sh?"O5hŖh|8LK.^ek\ɲkP>37!?ߩ -c w~wz8X(TXr0L{fiNq/~Km?Y:u~鲝,f r|;هd0\}/f6Q:?ಽvY* Wi h PH5ǂqBymF帰ɈB'3"W;`M3\L2chKAkҫ:pJgB e6q0f3;_-ݚplpTY%¹:o$O)E)&y`.n\l#\1s$o_W7ۅ&#^xһuGmfZ?AZ߽lbo<.Y\*-^fw ~ p"pmiF4Cݷ[?iB:1bϼq33YC~Z+DCӓ ڑ N!ĉgo*%'7x)[h޳YDrʝa~Ԛ`28KkOR mEeEݠ}8 9o+T Z-Ƣ) 7t,<=ךń> _7e~wՆ.9!pOj];_eE?p,ʽ[ϣˆFẍvbA3D踠AG~n͵"tKZA,LY!4qgIp6!,=?Ljq l52qy[2[k3AƋa5BϜsY/MH I]NEYk ƒUoɧhګܗde ƴȴ0Gx^^7:\.9s$ ehܕB%'ݡIn($}]jǾBA=*TS^(L|BT+z4J ƞ˄3dxg  T,E0BF+"$ 1FEDp0.qp G03op L9E!8n>HQUeS3jı930sq8# !9.4UetPyO!&RWH/YZhTPJ>KHOLZYW*+u M.H9%}X[~2M2*y.3 àI1Rea=,lל "5TtV<+G~VhIp)aGB{l|ed$E5,ӈkE8" T2CTzTC.e*3QPUF;ֱZq'vd@3L6*K(%Dێsu碸lo,n3yepX]DW; 1Ⱥ$8Mbe`߉kR$(C$#aOpCb.l$4 AE,!L7Vb"Y)y"''\\5=EqL8baBEVsƟՂ--H!ž7Zm:U#BvnmH-INBSt)4VOvJQ0Q1X#҉ :irIQ6^wdCh,7ߪВюнF25WyuQjRǚaI\YroV1DZ #˽{o!c XB.-؍g0vܜGKdkIi{w3%|)a^ٸZaq^"65{4cWu2vO+M3$4=3GP%8UmHJ/erjϮǔ(O@iovYH1O&YNx`sI➆([;95cdm8Ӓo\BI\h`L&Ggc`퇍RbMCţqf[FaĬR0>iNAJ]A3.b0GTx[|H(QQz8Wbw#\%&el@X`."e Q.@r3Ð*䙎AE=~ʤ؂#=]wUd B:y4:ȥX-e+A@VXޖv#-)G*pG3SRj0'r?>Vjs7O9T(q T !P1d aItNm&[a s*`޴&3,Sxgh!lHx%+o P-R7Swa"Ia=9d'TRP|8>N!w[}J8@c%i 0,ҁU[juLac,$2@j:2AbpZ7p)  %>Pʼn㑴A*"1R+9 E=!@$+`"P;DRJ H G\8 Rh0ۈTx@U@f6Vဴv #9tXUȐAX4\PTƒ{`b D#U@=meaDDkPP^ZmYتA[岦n,XSW $ho^[vzg@_KF ߏTPc}P|s!m%-./:sJݼb܇ZNeU$ eftKzftgz2vԌG@Ȗ]2_l_/8,^f۶Wxn_xWaW{Gg.ަ.P8v^w^ iYNVgWK ӳE:]= A1fcRyFH[ a }:8)PAtiA伟}N&wKvƟ-%wP䠼sIc.L rz'"'#cһuGm͊?!v+Ɠof~op;i(wn]HŒJ-f2"㒛 ~<J"Lj  4OD =q#0K=.h=sFF++|׊{iV~#cp;[Ljᙋ1_OV+"z]"+n{u c>Mнq<+Bt_&34!h.qZ|%wm`j6mD_GjF:.nCp8m$ʂ~|Sy@C@,^,v"eВ)~MOɬzfKUOװz] 5E;FdCB6+vbl O݅d<ǰxA:]|Ue,&Zv9VgFGY90ʙoo!Tpz^:N'D% }N-K~E`~]Gp\}P^>̛̉QC ^ʈa^QBі"R"8EB!X&`#z43Q-J5RrM)/kfZX@fb4et7/wu/ɖ'FO'\.INgO[ t> Ȱ|4{>bV?Co_/W$$ʧ{k†=<1L+\ЯG =;,#;MBNpGFqG m0@h$QDCyT;*5a[Z>C}G hdZ($K!ڭIBtSµމbD(4=Iæ'éX[ ̔=J(I9*4h%׵QvߘTJЈtKꢨ(!&C7-U(}q?mCIzf#2BV vT7)+S|wtUY LtuX1}# rUh>473u {O~pɋȅCY0Da"0^uP|ד_α3[߾hc٥hyo| VY6\#IGRi%"B[m2RP'4i٣H`BZA j$8/Bs Jr-i6pU1p|iT erl "wl2|<'2-ɕ3lxH +](GC@G(nSܮ&g,´oA_[Ss`W4Cg:;X'҈s;6F~[scç(K +SG0M;>ȸhR{W$l8)o; Рi5{0##ihdQ,Go nnKn(U vk݅ېBڞ^|ؗt"‟FֳmX _u ńhD$G,2_XHYsl"BIh֙DMus 
}:$mѾX_zQ*Tr0O]ZlYoG׫|{]% svd[vXʕyӊՁ"UrJ0 3|z$BJ|c'<|S>2=y+ˤ&䋇pX$pImd>V["-քbE_Wy`S#0fh|˄jRص%hUnX!˙=Е+Ɲ=AL4K6,Y+Lu8gm8yRլK84s؋~"u4~AETLKX(&YWߘ6cՋfHH+NAځߗhpFuJ:U#J9Qb1!)E3#t*:KGv*ʔ3F-&ݖJZKݴ+p>MȎטRx`TtB|^\FQu1y)#:"F )N#g,(S2"}@S:P@Mh~ BXvJUd{8foګ&Dwc=E%;ij6`^i)a{gEdѩm1_τ1Eo%RIS<ٞ3E^ZfKo(` ,1V6Zg?@ϙa%f$JL_;a(>?af>$ҝE_;][6Ձ_9A8W~WhEsv2sV(%NeoN -ίzN!R=_rm.65$q 9TWh.-o/A5.cr:~\T)f҄+zŎKPg%}qؿhXgmsq@iIGr7gAXGdxd S=]G *)_D>TrY96~e8T ȥU׉jD16.M&r$2EnrDm(A"铧zvc2 Hԃy4w~X广$z\4[6mmz(^^WiDt4t%Xrei&K6YҼZd8.RRÄRp7!לS(aA{,?,VcC׋[>8,/HSu{M'XP1`A^(gY9gov*Νo\oC@gUt `G@$gn*4aXFPL,b׀& :cքbp3twxT> dS_` ;K53L0N6TrU`)3H|J]mع-6uytngZ 1@%{iC,jEgQxȏ}'.Ҕ>}_y`Z$Ӟ.o*T>|OO;#3\:ؗT!E9/7ea1|? [ctQ3(6h&hnlF0xl+lO|Fݠf>̞FIfV," X;B& p>|(Tېe7o{{ܞȀuۆ`(tx6h@'3A᠂QX Zn@@ˀ,W\ irAj_B:E)^4< $b%؜E I1 L$q[;ȇ_!$\}v`Sm/M%p a7vӾ"c!Jc x\n7vc1ъh=+p<̶NK0Y[<& f>||9%j7g&i)Wl_?=s7ELJW~i~ Y:+̱:r 7F(ǚ 7@2%Wd4T$;d4K;Cs0L[FpeZj 53y3?a%&d1O-ZX̳)fթwe4kCw3Ó kj!Rng~0>GgJQƑd -%/H9ҹIaMR7!] zgsC(9g!Ak2$ppo?w} e6kԘn dh1`B)-^wVYO>0oSpa I<[,ee[L!/"6;Zlϖ}Rt ]CCS,篋7)ehl gnA PF}N,yoUo#PZPr̺5q%1F}tމXpE\&{YqZ|PmYrcm\3½A)kp+E6҂rlw@( ktBٞ„+BIvX;A'xk4~8F^T=cbLŤq"`_Ȅx 5BYU8k1I"ke= E=`-:D@KXF7E߀FdZP ,a&X[ŸESؘ(h%"Tk@'Z*PKh1^ 5RX^h:)6NeAoYUe맟2olrdCv/I8uڥP9)=wm<?sfRCF9֭}~[`tOg/F'J!I\-"1)&^3V.XYezX^pO|NOM֤\x-qȟETK$MP[SN9H/g$I ͕nmpȟEd[;ҍ`_tki:G%+]K ͕nmpȟExJf0"%9]ؕnJJBcLSoݽO htp[=xru{nCS³egOM=ܿ͞0k%ZȉfD>7{$4z8v?4@2XpkF/$ BK 'g/>G*-"@9HSFޕ6#"e03H Eu>l/15yj<]55d;uYL!Vw+$G0hh94ie5q9X~hN#%hhx/q/%Ic@$f%K)a#!jq"zI$yӚѫfEmcw,\} $iL{e|3Wd}qػiJ e$7UűVҦkh`hy|ъIIJM`^g 'R>KZ^~1\uvo)Lԯ 3fEL1_A>{xfv ]iWoɻE 5Y/M;M>-Q*1&0u|YV*XJX/*.PhbGeJʮp֮VwǑֈ\MpIcjg<BY#|QYo2t8RH=͒?C%=O+qqIW;Iq8A]g;Y'C͛_-Dk!^#IʙGKmcJIILD$EA㝧R #_l 0!$RR//YKӡWY{[ãA j9&@E?}⇠i6KtԥyWBc5;% ݦ ՟:Xy[m mϯp"c\A+ӏݿwE8o(!W5)̾7ק?DpFO4º,[A]{IK^|ܓHt+ nFcs}\"ы?61*p#ONj¢9_'s#}GV>3T#luHܚG'pM@$v'"Z[׿ .9.gxtLЄ"Er{#xPBg=O΁nܒWohGp{IAxd nqθ>{P$xqD ;ќ CdRե"]iF̀ cՠ ՘V[Ǚi 򆈺ӪլԄA҇5ȘjrޟU\эO-y, L>ʤmzµ'!2O0pap$^ޟFR&趹uZG9/2NC 8R9"PS&c&[% tZA01JsÁO?OYrU~V&&5$܈H0K+6izGC6uy),ZW&[q-d%A`Fj[I [ RDC5Nbs׫]OEx t\ֻwrs16 џ $d0Ŷ="z<>a5wIoѶp9Ёѐi 
6(8-hv2.7,*qfޤ7S 3MH`InK͞H2 `vYGs'!=of4k>sMA7s`؁-$8N?|~:$%慟q"l?>mtI\sԧ$e$KMyyoI,RRQ ;x?AzC0!⒭qԵ PiJAq30"%0FɈR2/-aV&*gZ`eV801?/sܸL͇VV/ ۻ|\|?MT;͞2Xo.?|lFk ' {7y~}ḻeXy+4svu [>^s߅qeNF񧲗'xW^j5-(Vr·Q$XPOȩ"*ex% Ad3! F+Gp}iu 7klPc`btcpV%tƿ4Tp"WcF1,ƀ5fxpigH0ED^zō$8h6k}jcB1 8_1LOطEj.X /p(7i\|ZnĵKaΆ"3 NTmXXMszk?I+yW)0bEj:ߺ Yw2-w ڰsD#WO.2}̱0wXSMFǟ9fy ]3rd1ߢr}u 2Mڢ@T:m}7X}ELn;^)-xGs+)9!=9T Hwu>^\rm5FIe |Hؽ #=6\/oUrEqcggbwU:"<#;橒v?L"A*"gn˦ @Ln/ʍ"6 gLfAJ迧eyA3[ wߓѴ˅斍.Zgo p~Oi>{?_oϯ+.õ> 6%ĩD)sAa7IzM`2[yJ{ѧIXt6j[r ǁ=FqZLMZ'_FupÍ4y0+@Sn8Շڵ48;oNOnE:ьI7#~S¡aBkcj!A/ҷ)L & RE7˒I]Sqi {d'yprAN9i>9ԷZ`43s8CBf!:)il N*rۂZmsκL)WStZZ,QoZohMuv|( ,x ,Wg0F_=kwYr7w8y5oy$: S8s2hج Iu48۪?`(y'xI9 LH&TH hl<fPYƒX+0Z3>ڰNwȏ0OVȿ.r[_ƧdM$M&dGgXsLcڙfi.fBpnr}n-=߿{Z4Ο;n= gF7/?)HyX6Zw [ 1Y}KD=fBNK:/ojŲ~hA) 8;pPLR ]|Qrr[0 ˩f5+ϨA2MAY%l.Ҍ$e*k2H# ˿d-顱;"uLb8ȭ˛@Vz(9Qu}?##co..?mzxHAr';yN?6Ws_?}LB&+@ԵWeK6-(씴q"Kڶ[VܠG} 3aqz1Lq`+ްǷQ09 jNZho ( y;Nr.JjxB{=_B+Oz2JW(N80 AGÎ1>Tt"qx5Snmޫ dHFt^Gs@R"T )5>ͪ( QR$>%3G\U #CZ+0D)mՅʂ qBUNpyF$ %4P5fB+QޗT:k6h;0 _Rt5q W$& 4Z`7Q3ݘW-,S|fl}+u*FS#`Re4+@&+[jlc.%5FD}O@ijN@*o !eD@pWJC(59  FKIF AA h[6 E٫8N|xgo߾ZLHe+J4.ʴрN#^?ϊ*r5l"f-#fU AlTm2JnG37Aul!q6e3Qsx]pO~.=h&_sg%ejj^ieX %̲`~ jOԯrY%E>俾i&H%5XSɂ&/'&JHtNHQeɨ!l ڑ.Yӈ 7Bx5̂[_V-E3_'xP݉a\*>?Kw`$y'Q[SPDI3e.6JkЈNJ I`˼-/he?rP/ݚwwi³j^G)'"%jh/4ZI6HqCM‹F~š]q(jKo :}׀h͝8@H<7SW(2ah*XaiNHBΤuVel*'zu!U4WPdRnjAIp6hnr#J2)EQ{.M7ROPȫzWtYїj\9,bHqVp9 ;O(6^f\Y1gt&"f"ed# a 䌜୔k$̠OY'I^v/Rȃ#[!s.9Oz2Z*(P*4?װ?.vT?I>b9ǡ'}\~ Rqc9G)J~(U'!W`ݜ% CE@kll7Zӵ#fUKӟ2^xVTKQtPfDr ʍfmPxWGQ夾0𹤾j p )F gkKwQ1qd x0fJӈy}=O& @2{ 5ZȡOUՖܻU;\joC?~ݗf&p2lYZC b9*Sp=dA!6YBiȒ͒jc(g0K6K8G%mW~qMԇ#ވE1IlE(v2li<#0Dj~t)cMᥙ9+=ɏw1 V Z PÙci>B+ePgsd-gS7͑0 m{x9{xYLBҵ[E;-"\o}fd}<壿l`UoDUB5[%\Ғ(}xZ%e~8,U.ˣd~KQ9^Wڛ?.͛Gl7X g)D07kgr-w8 ϭX3J*>D1xQD5i-6{-*`Hޕzz#=RIDB@](,NPqQMYљLM 6E2 |bqe '"Ln'HFƅP voJAOkb~~ JVisNn3cz@>vK u%ԍP7^BT%blr\@+&DGB0KPT3)nf)=W䏫^;k=M_t5x_vh=N|LW 5*GWy/ۼ7G`{q#cC}[RRl\D]D+PTiT Gnp_2jCjQ4XtȖTwپ50l*>i@rRE"Ra XF s'G̡K2[6C!Prr[%EC%YffqXfs&\S $ǏfjZc8N7\W|⦔1~M3o{e}|`:`-JV7Qް2[|][~~Qy`Z=j~웠o*  N"7)M?B?<,q4=u!3 
f(L_̓o}±fP._@@87]X>̱G\^6s;?R*܍Wn wSU֌1꜖\r)J.2DŽ.L*Q,%x"CNΕa:kfX×o\෠m+^ﵳ/p~ޕ֑#E/gU,|ڙ,IN&%X#cغZO ,XECdR=[J م3"FRd9$I{ jMe^}c0c_!fN'[.i^[۬:mQ2) FEAٽzf+lʜ!9AF^Q ޏ|+~HcR8d=t-IjtTJ@@& jp b&J. Բf]ix;GMpYP1^{̨)Y>z@@W+7J?|ߖݻ@ r\yhIBEI J> 7Fx(b||S pA8d&=4HjS(`'Ǔul|;nIB+u!pB6ƹ uyvtCm@@ݞ6ʮ<.,)2OewowCyiE vmxr՝É`)IhkRT2U \%ϡ1Tas7SwGM9 ddgjshk'ݛR.N? dЛb2&vβ+lvTu{̶ٔ:c%KeDN:1pu׏t`e6 [eXchw NY{=sWKo(?i?N c|FBvE[N"v}N>kqQu۩Es|9%#D^]}5Dh]ϊvs^@v1pXa] 0,LU:9]/CP* L$Ed@Mg-ZST5u'˼([Ed$aQ^E呝Qly;Z1yC[,oSe Kɀ2)@uykb jMe 4Ɋo 쩺pl'b]8)| `y!. UEt$4@ɵUhQMNnf+2Ӵdffڎ̘aӯOG- ۶5"M Ck<j`}"h)/CpQb9?>Mb50shb ِUQ`.t :2v'BF$aL1 1Kױ5E^MoM_Ζwqwg/nw|p'4 cRUYԿs>{nnlCoOxwlZAz7%Z;s0jzyWGO:(_}r)1yvS&7,G~74A.@:̼Luu=|z[L`Ik[mݺmt$stXUR*%kMvp$ 3OP g arcE#>AcB8>^wśv}6dtGX>xd~\ {ɏ>>‹ʃ~j}FX=L_28ٻmۉ sSNT9|fVAQ"^LbUVla&yrQ.Ibs,ɬRpgtS.QWSFQٽ CQEǓ!0QpJzvJ?S{سjr4SW2o>y[hܣdԂ{Ny R ;%Qz{)=xCM<6i!hmt̥TsmLQփEĨ$YJA*F:<7s2i?J`R N$Gچ0N @Aⓒ`v):Ncs9]Cȵ=0Vs`]=disQ=v9cT-X8D嚯NƢkjy7t*<0֍{(H|6aZs[{(4)@wd6I nv{8 Ӱ89[3W )8JQڸ1b6^,Prw黫=ښvQ؁͢Qَg@QR[~jDK-m-m{\MKk*sj ]u߯)B-o߯L1Rɹ{zQ&_ӰCJ¯vpȸդh#r-ْ(쵍6V1C0/4tc4:^ޥ8HJ7ClTۄ3<"O>dCd_ޏݬGf2a"{.r\~z:RJ!i>1H[f?_>_[)}g c5lu!z fv32jŝcQB K3er{=`]]uTviѐ1fx<1~LeS: 2#V4lA>ĨFBHe@]JAEc*Jl$-SтUl q"#r4QFa>IPJG剉[G,hY/3<\[h!h(^oK A,K鞕$ 6~XF2FFd`P!cRE0$%!;*.I,A9?EgCS (q7t@ B4T j+Vd;TBި +,IQ(Yk4]sS#ҿ[ PXk:; ]PЎňFc{bO#O֋#$(!?ﬕ/|]PYqjѢt㺇R,m%^/?Q^<`6)XUC}6'>,v+?|.6$w!>+類mQ8 \$]duJ%.(DdN[ScBgA~ec3 !~:r[fӆ1n[R{֌d2m|Z|LA<{%5%V(꺑|ނ~FRDC7(Ҹ9϶Ju}?5[V2Z'<9.˫\cW[i?x{ykRzd`@o~f#tXtՏypu~:::8N_hemԆK䶒^ԔN1)mRV٤İ*HGExL i&  QrzGpNAP\*3L!IӬY{3bӬ=3XlJFTڅFm45op\P+e-dUN IVI;)CٟFb!Tpf5xܝlk6,1ah$mty,h[)n>0& pv؛g"\1ݛ:owwR %?{Iesut Qu&Rb6qV> llWR3=Svi<~̞7O yڂǐ)( )Ǒ0l͖2 cc,jާDRxJG(R zs! 
q9TQ&W*wH8#d8Q0\Lc&i-3S51ygh 6/}6ݮNk:,uA2$+0j T:QҔ# KEb>dYs+$V Cr/|cMUq,_.|<$tojv;nŒ7o&c}=ן~<='¾w#$T"_B Ǟ#i\ǡ& 0XzH ?Wj,޾(Z|zfjtkI;ƍ}AuBIj+^nJ1d-%R,>oB[3Xa5C*Kcx-N٣dDٻĿ_DV weH~`S}P/]`gѳ3ҀVYTlWy0}HVZLf\.F-qpMN](+#w h82_/D(DMgWᗁa4]僚^^F+*Ea]n)KK^,K4;+ҘN G?dRױ;xrHVWng9~;5y8﫵_n\guD#rhC~_F] '#Mԣ2Gb2rxmF:,O翅k:D>h>y3R|F+2 !gTA.P!Ғ1I ^PJC++qHn8kdqp{llE G]Bzj`Svj,σVW(;ac㏼UHA5n릏Sή!o@ :R C,ZO# "*DiH c$0sPDr b er 0`Id0( Tb'R-78RԪ^w-h)~2p R6bkzrq{~,aӉYƷ'^Ef|?3Yž6vZ? WXWL0@\)iJ{ * 5tPbtu k}!frUz}&<]P'/fpu]ݹɟ+鷿55j6\hH(O} OQAeʂC fDܓA'[@jWy^2gl a2h[4AC%G>&oE0/  -BH5s W3YYzyOYHWޢ OI8X Bt9;hkD mZ$#5$Gx֯]a(LO2c]G7m v2ZS=zYO.BGs .y[4 IJ-N]~>W\cpU˯['uJ#i.뤐 BC*8%"&9 BVR\~PVU[!%^Ƈ+A#)VΊI홢;B&vZID,5i&ML! rX n#%5пYMќ\+j7;3Lo4VRmǧS3$.V`5).2 A !NkE$(hS]Tgs|[{7)%tv,jZY7.hԊ %vz*97I-uԾ@@EԲ9-j= sc4TV)"inu8Ev(|L-'Y7XvZzU_'}Pjj =^_$Nxhs]Kzr;p6PQdƎ]T\s6ugqnѢYTEF51'eU%j}JFk1~ߴ[>cX~@ PtG17f[lrc];41rHAAt}sf1 {9gL,~{OD̡J_*F疕 c  zfK|O|56CWr?̔N}K HsE^~IiC3Dm6^wO }ɠęGEnRmo!dt@BS^co/B @B[ Ͱ.|c .[SvP}hC{ dWg4[FSm9b(JBHMws(Vcb> %'1jBcq")Ml(.h`]uaU.G2؊?C6(.7〷>C.vXX7O/~ V7%D?R\.vMv^~b73 ?-ͶYLY_1o3[:mޥۂuoc<ŚeeA# N,f@.4;Vo^mw7Η2]@S BFG`m!Xl~V G A*}$5~qP׃K̦?g!1NIh~}iUܭĀgiW>ᾔN짅Q5>lrNyP}3b}/v3U 4A>k㔵OV݅6' ogzxLJP҂h * /%{981n)m2:0Žgalm<36!S!|^EyUr-:7 ÝW:6rARז[ϸ5A\*k.=wZŷ& t^LȉT;+%ŗG֧d&qe$0*PR RVXvʵ#P)5WZq$" (D9ܖ#D\R KKi25PtVv ,֪ng9#*31jUEMոIpjrT@D"(okТRТsoDA4nTb *l겈 tR$FϊәKD$fg nl >A\d&@=up=P/ E$ ׃ dxőL ߰w2#0&drxi:(C4&9jSE&SB94{  鴚AЍ{*XP eJ8~uhbde)ce"wx)v-@D' a]rqA[GD&Z"yAX/,'M_%AX*!3tmCXv&s5h sz66'>N!em緮")HVzgwzv6oXi+b,:-5fkI_}(Dk)4m# rP\㰐 =2XRFB0 0PL[,+[.=alHYjuyh"V/uP&)DU)WXc o-TB8eG&\0޼U z<-9K)xg:i)Ȗ%g.n,j8PK6Fir%lݙ!.6,lW*'ANCƢWsBi+B=>@ dttq(ex:xaB.y͢3]Z'!XGgGUG$` k~F~t/SkXAa[J RX[O N Vh[`.8(8! 
6K( 5.̫50WVZ5 ACW&K[h|9 S ZvNlCڗ9J确 ;N,"-&-n/,bΌ&@O[<:}3.-ԌW0ykA%H8x40v53Dz)ui=D"}=>̄zbNs܌$KG^uԮ%;Q  2UO"0-O<8{2<.J<&>Ų e %,0g9F-8s׷sV iṷJaY;g~OkkזN;MIsս|~hq+ ]ۨXJniY[E ޸6:ba#uˠ ank)7n Ljsa;;M]8Mb ٻAGVb_SCM :x$BB ,dPXr밄9E;s1#C@DDmlap-=V Y?)ʑ6/α:jhrt\)۠Zf`DUh-UnDr~R -E8>=3_pIATM 3xXC\YEMY]j."wxw‹@ L!vX"[ :wx_I1|9ZDqdD>{UAR["s/9)1m)Pѡs4kGRa\zL-59@DYxE2b@kojY_[Bɼn6Mԩݔk43c;~zd,KJ[Q?=3ݫR{}.6#'0=ځKĸRk[Ӽ!<`2F+)t*DNф#*GDUŨL<Sh5J gbrX8N47$i0^aD &y"L lF=`!W Zi1R^̡T8$d3aRIT\6|"VIR EUغ6h6Qf5lMF3]6)ï@XK&"NM1#F&g=mw6#K&Ct+$\ $.bU{I+AļꔵʗUՁ:h|xɗpAsPU<ЅcHʓ`7x{-^ !*՚5fuM1$`ƱV̩k2 DeӘGF$B&$,F1Ws ¤3S<Uc^٩} CHSC8a'BBC78KQAX*KQ{TfTָqe*oh }$aŔh DFQ9dc=m-MZ6_q45ֈDaW`hHDZ #]䘔1[g\ˌyG[/c}C"W {%$mLB4$[|%#0;""})[q*lAw&+ĘF'X!zd]$mt8v=}kUz{@W۪k 0RpeqƟ.q!+L{JXK~Bbb6g꤂"`3^\h`ITQIG4uqTD[ÛS_oMRiN;c8˨ɃT.R TZ(It7RJ& WgWG5E6YL A©&tm+^! }P}HNK^aVtZ'0Ub}HX9eBDcrtV U:wyX \ǖ 4%)A  Xz3y$s%W1#h\By"NHD (;!e'2象;0,q2qJ*C60gK)V ,Eۘ38kؾ܊^R^H ؄Kxƚw2! V0#h9&!p@ZF`H[h-r HzHj&&D[;6TJ;6^B;Bh40!vyc1F)Wl%YɵbG1 B6]\׳Wft0nE=oVٛFХcŠxqn6 +sL7~BOt3MW '.onO,]>R~s6ez'M˯qWn~݇={w}qkvo~y?sovObzc}@Cun?|]?(ۮz1|=N{O.ݺ܊{踵m}<ǃyz%Jv&-t}ynߌÕ-= !Rθ==1o&=q~Dh sx~7Sh0cwޞNfz_k䳈8FxTcs$N=`44Jl!(0buOUs}xЭ?S2jg.ÜNz;Iٗ76|9%cU/վD~}07!hL;ܗ>N^|`Ly;{wkz[gD188\0ܺ1tT `=p2ק~SN6w1W|쮜o#xyl{kx\{|zx>)?}=- I3 u~|!hwwn߀N\5Cw{@>Go=7>n}uOCx7Ko|Qzjf'`0e“㉟7쪁5iMhp2L/5iO>68g!\tg}3˞7YG@3ϾZt ߦ@Tkά,N/Cf49N˔C !8{2tRBfgSQ~:Y\<`+EWZn owZtF MY @(q4P90F 4RS_` 7pfk|h +T*W{#{c.8NMJ޸C/续sX[ JBjq/?C<ѠߵO$RIѱe j+^d>p0^<2${qT4AE*Xly8N8'J+"p7UDHc0^qCY$Ոybcnb!Nc4H7&"D@X]D` yjM#9:}%Tih@aaX? 
=s5w5#1x.%Ʋ8Q^kL |I`PK0؀ 6`pu`6@Yj)Ut4S *D$D[ű6dsay&0H|`?8T+p;J0%aRcj=EDULdh1i‚ |QH]\{ l`$ !daL1I]d@a &D1Q wDH )w낁g7Lz=O K1.,]fi7K}~,-l:^,IK$faJWJW]*Bj=-ѩKa;LEusY>uX:FN2K${245:R5B-dBvB\$5cM* `bַd .d'29O?t&YNdͨò#FDoNFA83/+,IueE`lOOL䝛C)~L'9ޝR-u F6  pH5| ]+cPe VgK R5vz;Ks(74=y)RN^Owot"`\0FղY=5k2ю(my5|dmN'YfTl1F 1,M9 L<()h4Bg,0p BE(:K%j#‘)u^h8xS9x)xSpC2p!Ó_.[ΰ׷݆xUCȠBUsޏz tfmS'X7]B?3`o1/im$'(dBL}3D4[TTi%zKh}dm,w^+Dɦܤ ̀9}>;|G!N]-B D2R 'H#V <^쿃=ʹiߋ[iY.K76djuXX}U%4 .9:f b+p+< Utj{ Bsw $ߋ3r 8EkoSZǧ@#T~L gfj#Jdn>Hd (Zίw};@ Yp_=|wm`9s{o0i]lIF[U#1ؿNmQ[!GF$d.ێ=GZϑ r#k>W|25RO\vӯ_ cn>N_fH,})wͩ|(lQt-^E1x(\F5Ǡ70#>r-͛vː:B*U>֊KE㓪h^Y%'SkR9$*&*=i j9CΐMQ_^Iź'T#wYY[&$BIc OE}nw8ܧ}o#w0sز?%[OϠ%Iܬl|vGJe~y Z\< 9ZtZ5I #]M֊^cҺ8V/zpŋR·k^ r#Y^5G(y=kLZW5EpX{Yǎϱ^Ï bAqMRET9i\p6W'}sGn[I7ؒ} ?P^vv1Ov<-uT>Yi6Hyc86je,feW}s*>xqt5j]|*xJCjgjx@d Έ$*h#}caFʃ rXC]<\0"rn\f~/sR2g?V_t//-|UΥfBKjmr $1' àuVJČd8[F=g_}eq8[F-#Ζg7l['qcC`Q#&aGə] RF\V"Ʋ;Rtτ2?$u 8 })jrAJ?]ݿ+QEuuցO8gx_o:?W ?)zf(V~j}ſghcnw}g9>?Joΐ/vy?7~9zZgՔ4wد&YJ$.@^tK~^%AOʬt"0ǸcB?9-5AcVlpk!(*>y):paߺf7wJ;j>עU0ْ, }gTDf 1'gKS@}]B \ߗx *lRb<f,%6KRb<\.Ŷ،1 klGfr&m`ocPQflVr9hOpo΍T6u2m:՚<4eXjkݬ"7j1TX?;xDrO߶7 ]HBr܇#7!.4LJs*xղY=5h7$i@?ojad!džEYƟ||z~K)%]fecv͊Qv>1NVJ,qIoTHWj42h3 ,U-#ײ*^xW_ 7e#ʧ^x`a(1o~= lg>|}k(||ϰls }RؐknnȄ\|3DOj(**N= 1@K2L.vKNZRj,O]Pt@u7p*bz%QYg R )[ /ol،OPps2}!0jS\bq,>/:jk@T5J9q#sBƺlZwlf7q,u g>Ks!s@S?_u'{o VUKe/}tYШW[YxI"Zr_ʼ/tX-I w_Z'^>tpʬS?$b2"vH@ ˷48P{G U*V✶LU byo) *זlK+Hi\u)mu씗}p 3=aeuYzj0 %}z2JO\'%Pb0;B ,ӊfy)ޭfl4.T 쒾*`۫7o!m' Ex0[&%|rXwjWzQbo(=K qrխdڧ?VW9'8l NM?.lMαS&#jo=tO&NKn57OӇVcmTrLn4) ϛ@Ձ!P4n\mn4o:V@$VŘt16nIeg;I{U,wwfF a30f9pmGFLq5K,>hUa9"H&KfzY 7jqiin`!n2&rct5jFT!#Lf3r8#Z @Vj=˸ =CAێZ>1%kI~}鋸n/4y'j0p BJKՈfBe+GWv5ճ* Tk셴o/?só,|潬hk"+28\oECT ܫ~N0#r:UAVCVdc-E@tiLV,!ڝ@WR?l뇐DUl sޏE odZmeΎlrv?{GnwHIdn.8 r|Zyu=zF=R?ێW {mK#v׃ŪbUx;ZY!X\Pp?{ rTJ2p G"{`QvȺkg UoO'|/5:'@I'coǢջ`QBz9DVJRT0xD'pH\?R'kk@+YD`G bN40@'#:ft#x{2u]Zjeل^K >4X3Ys<*J$v7I͆u74ځuS%7Z>SxRITj+TۛWnuVoݞf#Kg_/?:>}Fi_nzⶾ^=OOOW'țw/vL-'/Wԟ6rumyۿsW|X;A\&\_^M`2{f۞{urÀmזshq$b|uҹ9tV,dNY:J&rt◣hIÓB0 ۖ6 
(Vez DTQisɶ6%5 l# ~jB`pQuiuIu=ϧggr*_$3ԢSf[Dm ǯ:~uY{x=[{v%_ZlNʀaʵѲmLC_Mbǹ;zR#805Fh Gx au& V;k9(BNF&_J' >xe)GN}3BAb7ڢ8VRF?s?ڐkXs"l ]sBQr~ᑞwnܼX :Ҧlam&Ho?hH#C4Jx Gn O>9'/N=H&;ǀAEoC`Q:ܮQ|[ȞMmF# ('>hiGl_ͦfzlfm %j`cF,w0@n]oOP>OG9b%g x^>$TW;S? xƷ/I$P%!s^3߸j,L.H)'"ma;Ĺ@ٞ7q6)Is9@m'68:xt;y>#b'yv<[q4`drϧTɥ. =kY+JNW!"{ҥR Aj.7=_mzTQȻG> <}:', {39_L1CEd?f! :sgV"CqHBJQNݯ.R*s1@@Btv33333TK$k]v)*ư{h |cuH /2{tZo>e{| uG-OɋNќټ iA;=G\~-[ϓB9w Ǟ) 5HEыF+K5&5{ ?M<1ȦRBl]j'#ٽ3^aseoBΣoo_>ry.6ɹ#y.3K9,ܔlIF#pS&LN8Uv.'*2iO*^K OƠdFG- sN#,$*T::͂3ELܛ6N-1hr2;DuXl<~|E9oa-l$mI^ZhK$mB PDpX2ޑh΅vz$!!:3>4D115# t;}JXa{{Ri̷l§+4h7ۛ꼽ۛ꼽uޮWN:BWŮkIPa4N~3U0HdAYh]Kfm})r{[g%3,>j;dY8}b9m3tͦ}j.8M>^<ý7P ~y5(۔9US V 6|>iԩ}l *߼`QȠ>- 8>ytO&EH!)6Eif Oh8ּ=[d#Ab35ω{q^V`JD٩۴i)'vk PSP-:J=μϿ|pD dRѱvFhȗ./E6B'S f$0YN{# 2&b0Xe*Q'5;LVL `pyDZ{jz԰J;CEGEET\V VP˸:Zȱ@ C"P4?ջr!LoɒpGwx`x1Ww8#C[v-j ; S{3Zw+&m:3P4'1$p:cȰYpá]bJR!a7FKۥCO>pbްwgSjh5㍾t U8CJ͞.a$hC/iԠt V5e:%Ts_/ZjѭBjC]l4%t!2J2dK$CgQwȁiN}C &YDbt9>z4۶AzwXT!!4E/ݭ'yMȁSܑ҂>#(vǺES1U)ήTvX8?!u΂Rb5S% InJ0kQ.` ƫdYSxq-dJ05*~t ۧ.9N=;.~>tD17i/kxY|_t{?jNS;-{u2^r %w ~Pݟ^\ݯDor\J;Wgjqcb~2B} -j$JydyU'm-984ӜXPNfugc[0{EAѐ:=jURPJ#ag V[Lef3X#vfXkْJnyx[Lt n]O\¥s#MngVլU3Tv|v2'IV]=ڮ~=σZº5~UZ)Bj֢ 6òRv=SGiMRN/yڊw3'kVw8Ppt|ҦO@#j=Ƥi6^z):Y`] -B{pH;JjCbx'.pWn)#ە=Kv(˙aJB?+ k-Dn~Ǵ]Kv)˭y x/J[ujɞ%l+#RsU*uHc$yI{SN~ui ^?Ra%ۡ5G\_UN|M6?!߷y?]o5m״kXR?noG`}*}QisN[Tl](P"ۯvcԃ!ϟ=z&ksJ/m;]MY]wR&I.}EJۻk58Ҹ($1sꛠKȥZ+%`<7 AU-{;uZ=]t,aaOwq{VrgmXEqna~ I)NIۃvKKH,KFhH䊜<^zy_yƬᗤӧO=t*Znx߂/_n=,^k'peĪ /NW7 =,gj۫@j//~''f6St f h_ l ]dgA.ck>U 7t;=?PAFvY~X? 
&3-7Scx߾{s/zM&+|VGAv]%rkLɬ >¬Ti s`QF^rٝJƷt`G"_[u/&ڋIxu|@WU&i~)XR \O3: ~hªctY R#jWD(lx3'ZH$"T/m$U)J88^{,sXfY%yӘ P|G!+|@z}؏Q b;ЊhyTCZQ!9|?$h;+-p'?Cf}C[RCJSXZNb)J!YiFDIIPddl-̅RO|&rw8]d-T ^7$X٠n\oD6*IdF8a% XkOH{6QQeR[3^*#!o7[Jax;SPze m-I,A,!^I9y%13גUcs/"2ӆɑFH"%Ў#m {v TW"`ѢJ +W Ó16sރ"QlBR5ɲS ?F׸r `ͪ͏uLvDגlop3oHK =yNmaJScHϝꎆ.|Bԟ\}w>~ ?O)<)Z-?'a5N_̳r<ݬ)qX'̅~nQ݌F!~H-@v}9Er~w`IStI7sr5>ˑnpg/Ac0_mn͸ًոkNI>$iwtq'nR'wp՜X(IEx[I;~btqՇ/n  9tIם|HBޚ\Ymxf GC@AM~?`0}v />Mpu߿j/ulӬcyDD ux]= f;1gٱiG(` < 뤲 w2' l9JF~S zKﺾY G`h^(X.Z-%/aoyvpWK~?n>oM;Y{s~T߿gׁM09G_!-&}}nOF7,c`e-[R9? UnH(slh@𭟿@=x$|LZ[3:):-:>_-ŚxT~9{CeȦ̯%T..˿G~`X((^-gz^}n8P؇ @x,] w/ @:){̺7o~\H{t7hn/?b!n0i2S]{ "#/z^X63/O|8`ʿӸ Le`#.Fwf}U8{3z0Z~l߂91O7b/L:r xr|Qu!5wyxݿfO(\|t\>?yRTDEF l]QNΟ"%a:G;1;?,#0蹂"퀍.e:yLyn;ƥU.wES(u4"mє>GSqSz5dK<[Z x1ʴ^yCP T9'QGJ9VHeVޗb$;A8+ aԦD9-΍+`Ca'6:# D{nY'e}ӝ)Q*{D5(~)[ ZHI: #.*z(1HnXv6s,hc5jFn^Kp#oa5S ɝ=:ckJS>uJ#B[*$7Ĝ硏sfAj(W~\ R0clRK ^#ɇBReϛ>0d0vρoOn\e<5*/rNxB&^f /N^מO_lyΧ^a/9yJ[R꣜e pXC ǽă0$ G0,L+O-WCO-8\! qc- 9'15%;0 QR,EHfRiyEqʉTZfFZfDVYPGXDrDy2e-W:n 1&C ){k7^XM1785H<@XTCS)SuYZF 0LUţ{jP"\p(&6y-}BCJrW2(ܢX20Vq](Djn <&8JσgphBQq[< X O*tD@gR[R ѳN+X:Ž1VD.<4DfRQA[kB.t]*v=݇zG%MƬ-'d .Xk  e C9H,w,lDX4HVZ7He"%  ޓ(ҰD }TacG@!Mgb}m yp sۜ& :ldbJ`*ZU97bnq̵01srɟkOծ0GXnMڊe> bȽ=FZDRĮ* 0VJ^:%;C^>=J%Vw~=gF:qtz58ԱS (C5EZI8wMmHj XfU;6}I'{Ւ;Ch۞7h뛈6h+`COgAO5a-Xu=c$L7v*rޒ+l~k5R-ԵOBYmiT`e#;T[ɝ~zjKʩÀ͸HPn&Ŧ(8|GisNnziՅ [mv^U!W~;agt%Ɠ#(#:Cr]Z^֍fB|h]LqrYRX߿b!YOR/ݤOAA }Gv@hޤ[@H|,S,+ ۹x(6 Bol)ʨy_Fr'2Y̧Zy Z#*g=Jps9i^>a\2dI IqCԼ&O u`Z?68\j狈6uX{ :H?L73 p[;&ҮLdyDMg7㠰U 5Q)`^o%J2D 6!,0:Eѐ0+XR.GN>y ;?M 9ii#i-J>+!YYexY[Q鈢k{о^I8rÛr3\ M<lRNժS:gT;b_/0ӓ?1qJI5/ ~ȥPZ"5Q#_ypjрCР!9 s5X#S %ҕbjLBk)׳q߳"`"a[zo K0VR1B ,PLRhKi8UQ=wȜq!Lj08'(Em"-˜RI2G>C5wVZ)Y|D#H[*OY"Za*8]Z]nr lE#1:=TLEdU(Fx^K̍J.38[.1iMڳ39Ti[MC'ok$_MMHyoiH41.axJ˱p}$* kTd􌚆({#6"#6*" Ɏ;بF I*JӛScV_s#1%!|K"f{'z > (n9.}c\2EYjׄf 3#Kq7˨/0/Jz15 - O^/#?uMLyZRbm̱T$ƭi!6ӭ}xشW 6Rmz5c0U_Z 0zoIcY&IU*)r'0i% ˔'D"$x|O5uRzH%%kRRvl0{3Or)4Q rx#`D0r|q(NBXfE3nJq\d 06T9%*=^*&GPܙV>aGj/%Wc1[r*@_uYU(b5c5/w hA$<*S$b˪.qI 
;"JE(_( :pp3I;Hq5 u?aaMS6yf,&Ƽn@"jL%S ܑF[шaJVnᎴَ6RC4\+ 0n D:Kdg!?zG9 N$K1X/r I#a-y?$|fDac%_ u,I,1C4gZ"X总,nl6߲9c{ey8?%ۭd~VȌVwWb=u&20R\YL2 FЂ!Z(5ٔ# X҅\0u߉p#l~{ir"Ve{"\{WoT&bD5]$…qgt7t6^Ŏӹb-D,p۩*:!?"`\LWpT&.1,O89dsԤFP9W.zˉp>wؗg:js"_"\(ȄBs"\pc~@F83ҥSm0R`YRJU$P8UZ`ELJRefT<%j~EkTw55*oT&, Kzxɹxv &hǧPv(vhx!A/.yjϷ˟P*;M_{6?]4ۛQΞu~?>7\&?:_Ie<}br< I ᨰohb%`b(Wix0Ҵl!]GAZ\=hܡZՆI*l)fIVSb%euw"1Y=~n>~WG(lR _A@`ބ@VƠe sGX! ܼn&5;VWѤy~B ϤIfD9i:υ!(>fǪ's)JFdHf ۏnW[Y1[v~!9^gt/xi3j`!|?WȞrYOy ℂɢ38 x8k#ӆteqؓB'|жW6A[0?b[]wkfVSs)nLJi&% ) @(S$ifҤ<k GGZhV,HR;e$Ғs+JU"#q7i]hYRY %=C`~ʟC1[Q. sBUf,PyQ&׫Uۋuo6k!`! cLV'D1nO~){ IiY'nJ2*/9/Kmv_ϘCLr~XApp<۹ '@$7=dvnZ\Y4sb%W ̥]vYi(+< CL"՗Mj5)CTf.EMevSףkSz?O}JJ/ -K \H,_Ow/31dm 6_ZIZޯ6J㛇*rwS٧れhԋurjp?׈Zϊ߹^p vnnϱs?I@'w {oj wTAVT'q)M*+,Ŏ?Q-m(߼-~9 @2Y&jVoo¨\V,WUw*?ʃ?zmnsmfwImC]}cqv;2gzvKT_6&vi8:Q 1|4cEU߃Aɷ(]p0yn>dk_ސD?\9 d~Fb 7Ɍڊf!V  ]eSt1ou(!@rJB4cUVϡ ?֟ܚjI [3PR3 bյ]C;Rzam3IeKUA"V |Ŧ}u^Ϗ~ӊ 6#_T~2:Q?yrj,Zա=AV֓]6'.2B)sazj&9}=FnjO7 ѧٓu|swwmq_}dɜ8,O _C约/ͲX=,]ˊkl'H,V5VI;MQ-~LD>#Hvw<50v΅nHl'y:zD>#DwmVLeZbFt;=,bI+YsQLz ,b-L-DS8EG`JK갯تl!؏J"Y^ۣl KgS[\ ںgXOrvDݦL8o~,eP}B`ҾG(B>U78pe [@+0m塙=Goa5 `Β?Szs<1PT"ͥ">G40dRX*{~xc14k9V>?o$Jg̣v<2yӊ7ȅl%A+D\y%aRj©[2ft{RJ >Gk|4O0\ Xe[Be%BK(&5jL#L^=)zSjbl?9nXpA.X_ MbB5jqݧ ΃QLka QI|S5fW_Mfbbgv*="UD^\k-`|>S%ʓ`p#8[D Ҩ!y Vҕ$ʳ,=-s"9Tݘ6~ζjԵygg):]*AbffeNiZoَ>!RiJSk`@rMfA5ԀNa@ΩO_9,(ڠH0bvŭ*ʍC(&<͑$3}xUsNu-S_?Ţx%c=&1֋Q;/2`:6ߢVD 7_D Z{h!B` _ *5B6_wxbxsFאT2Eͱd֘LZp€;M7jCߞh0ni ׿߷zp.'}%UXX@dM PzpPBLpʩԘ3!9/eRq*1GI)u<{ `ZF%7чG @PrqZu+zNZJPouх!u=43E):TGEJ3g(F Ɍ 143 L~-,- oAC%~J܏/44lwM #K|c DSmw ɫ&`Jzwoz2G>L4ش!XeKgY쯷yqٟۇ|ݭCZ|Sׂg<~nG^_Mg˱<9C2C;/P{4s5o* )@l6Xj\G;7.12%jMjj7Skz>vGtbF4M`꾵[|qvBB޸6ysGY#|[$^c3lOp-ցqm-SĢ΍|u.tVXGn ? 
Mar 20 10:54:33 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 20 10:54:33 crc restorecon[4757]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 10:54:33 crc restorecon[4757]:
/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 10:54:33 crc restorecon[4757]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 10:54:33 crc restorecon[4757]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c263,c871 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 10:54:33 crc restorecon[4757]: 
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 20 10:54:33 crc restorecon[4757]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:33 crc 
restorecon[4757]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 10:54:33 crc restorecon[4757]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 10:54:33 crc restorecon[4757]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 10:54:33 crc restorecon[4757]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 10:54:34 crc 
restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 
10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 10:54:34 crc 
restorecon[4757]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 10:54:34 crc 
restorecon[4757]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 
crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 
10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 10:54:34 crc 
restorecon[4757]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc 
restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc 
restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 
20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 
crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc 
restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]:
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc 
restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc 
restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc 
restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 10:54:35 crc restorecon[4757]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0
Mar 20 10:54:36 crc kubenswrapper[4860]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 20 10:54:36 crc kubenswrapper[4860]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Mar 20 10:54:36 crc kubenswrapper[4860]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 20 10:54:36 crc kubenswrapper[4860]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 20 10:54:36 crc kubenswrapper[4860]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Mar 20 10:54:36 crc kubenswrapper[4860]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.861021 4860 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.901839 4860 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.901891 4860 feature_gate.go:330] unrecognized feature gate: Example
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.901898 4860 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.901907 4860 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.901914 4860 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.901920 4860 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.901927 4860 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.901932 4860 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.901939 4860 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.901944 4860 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.901950 4860 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.901957 4860 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.901963 4860 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.901969 4860 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.901975 4860 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.901980 4860 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.901985 4860 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.901991 4860 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.901996 4860 
feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902001 4860 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902007 4860 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902012 4860 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902017 4860 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902022 4860 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902028 4860 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902033 4860 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902038 4860 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902045 4860 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902053 4860 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902058 4860 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902064 4860 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902069 4860 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902074 4860 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902086 4860 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902091 4860 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902100 4860 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902106 4860 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902112 4860 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902118 4860 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902124 4860 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902129 4860 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902134 4860 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902139 4860 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902144 4860 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902150 4860 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902156 4860 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902161 4860 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902166 4860 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902170 4860 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902176 4860 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902181 4860 feature_gate.go:330] 
unrecognized feature gate: MultiArchInstallAWS Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902186 4860 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902191 4860 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902196 4860 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902201 4860 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902207 4860 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902212 4860 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902217 4860 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902246 4860 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902251 4860 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902256 4860 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902262 4860 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902267 4860 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902273 4860 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902291 4860 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 20 10:54:36 
crc kubenswrapper[4860]: W0320 10:54:36.902298 4860 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902305 4860 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902311 4860 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902319 4860 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902326 4860 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902335 4860 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.903387 4860 flags.go:64] FLAG: --address="0.0.0.0" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.903410 4860 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.903427 4860 flags.go:64] FLAG: --anonymous-auth="true" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.903435 4860 flags.go:64] FLAG: --application-metrics-count-limit="100" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.903443 4860 flags.go:64] FLAG: --authentication-token-webhook="false" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.903452 4860 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.903461 4860 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.903470 4860 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.903478 4860 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Mar 20 10:54:36 crc 
kubenswrapper[4860]: I0320 10:54:36.903485 4860 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.903492 4860 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.903500 4860 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.903507 4860 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.903515 4860 flags.go:64] FLAG: --cgroup-root="" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.903521 4860 flags.go:64] FLAG: --cgroups-per-qos="true" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.903529 4860 flags.go:64] FLAG: --client-ca-file="" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.903537 4860 flags.go:64] FLAG: --cloud-config="" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.903545 4860 flags.go:64] FLAG: --cloud-provider="" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.903554 4860 flags.go:64] FLAG: --cluster-dns="[]" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.903568 4860 flags.go:64] FLAG: --cluster-domain="" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.903577 4860 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.903587 4860 flags.go:64] FLAG: --config-dir="" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904563 4860 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904577 4860 flags.go:64] FLAG: --container-log-max-files="5" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904586 4860 flags.go:64] FLAG: --container-log-max-size="10Mi" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904593 4860 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Mar 20 
10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904600 4860 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904607 4860 flags.go:64] FLAG: --containerd-namespace="k8s.io" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904613 4860 flags.go:64] FLAG: --contention-profiling="false" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904619 4860 flags.go:64] FLAG: --cpu-cfs-quota="true" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904626 4860 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904632 4860 flags.go:64] FLAG: --cpu-manager-policy="none" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904638 4860 flags.go:64] FLAG: --cpu-manager-policy-options="" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904646 4860 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904653 4860 flags.go:64] FLAG: --enable-controller-attach-detach="true" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904659 4860 flags.go:64] FLAG: --enable-debugging-handlers="true" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904665 4860 flags.go:64] FLAG: --enable-load-reader="false" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904671 4860 flags.go:64] FLAG: --enable-server="true" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904678 4860 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904686 4860 flags.go:64] FLAG: --event-burst="100" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904692 4860 flags.go:64] FLAG: --event-qps="50" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904698 4860 flags.go:64] FLAG: --event-storage-age-limit="default=0" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904705 4860 flags.go:64] FLAG: 
--event-storage-event-limit="default=0" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904711 4860 flags.go:64] FLAG: --eviction-hard="" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904718 4860 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904724 4860 flags.go:64] FLAG: --eviction-minimum-reclaim="" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904731 4860 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904742 4860 flags.go:64] FLAG: --eviction-soft="" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904749 4860 flags.go:64] FLAG: --eviction-soft-grace-period="" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904756 4860 flags.go:64] FLAG: --exit-on-lock-contention="false" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904763 4860 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904770 4860 flags.go:64] FLAG: --experimental-mounter-path="" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904776 4860 flags.go:64] FLAG: --fail-cgroupv1="false" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904782 4860 flags.go:64] FLAG: --fail-swap-on="true" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904788 4860 flags.go:64] FLAG: --feature-gates="" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904797 4860 flags.go:64] FLAG: --file-check-frequency="20s" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904805 4860 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904817 4860 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904832 4860 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904841 4860 
flags.go:64] FLAG: --healthz-port="10248" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904849 4860 flags.go:64] FLAG: --help="false" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904857 4860 flags.go:64] FLAG: --hostname-override="" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904865 4860 flags.go:64] FLAG: --housekeeping-interval="10s" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904873 4860 flags.go:64] FLAG: --http-check-frequency="20s" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904879 4860 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904885 4860 flags.go:64] FLAG: --image-credential-provider-config="" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904892 4860 flags.go:64] FLAG: --image-gc-high-threshold="85" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904898 4860 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904905 4860 flags.go:64] FLAG: --image-service-endpoint="" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904910 4860 flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904917 4860 flags.go:64] FLAG: --kube-api-burst="100" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904923 4860 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904930 4860 flags.go:64] FLAG: --kube-api-qps="50" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904936 4860 flags.go:64] FLAG: --kube-reserved="" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904944 4860 flags.go:64] FLAG: --kube-reserved-cgroup="" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904950 4860 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904957 4860 
flags.go:64] FLAG: --kubelet-cgroups="" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904962 4860 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904969 4860 flags.go:64] FLAG: --lock-file="" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904975 4860 flags.go:64] FLAG: --log-cadvisor-usage="false" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904982 4860 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904988 4860 flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905003 4860 flags.go:64] FLAG: --log-json-split-stream="false" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905011 4860 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905017 4860 flags.go:64] FLAG: --log-text-split-stream="false" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905023 4860 flags.go:64] FLAG: --logging-format="text" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905029 4860 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905036 4860 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905042 4860 flags.go:64] FLAG: --manifest-url="" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905048 4860 flags.go:64] FLAG: --manifest-url-header="" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905056 4860 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905062 4860 flags.go:64] FLAG: --max-open-files="1000000" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905070 4860 flags.go:64] FLAG: --max-pods="110" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905076 4860 flags.go:64] 
FLAG: --maximum-dead-containers="-1" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905082 4860 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905088 4860 flags.go:64] FLAG: --memory-manager-policy="None" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905095 4860 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905101 4860 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905107 4860 flags.go:64] FLAG: --node-ip="192.168.126.11" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905113 4860 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905128 4860 flags.go:64] FLAG: --node-status-max-images="50" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905134 4860 flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905141 4860 flags.go:64] FLAG: --oom-score-adj="-999" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905150 4860 flags.go:64] FLAG: --pod-cidr="" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905156 4860 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905166 4860 flags.go:64] FLAG: --pod-manifest-path="" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905172 4860 flags.go:64] FLAG: --pod-max-pids="-1" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905178 4860 flags.go:64] FLAG: --pods-per-core="0" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905184 4860 flags.go:64] FLAG: --port="10250" Mar 20 10:54:36 crc 
kubenswrapper[4860]: I0320 10:54:36.905193 4860 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905202 4860 flags.go:64] FLAG: --provider-id="" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905211 4860 flags.go:64] FLAG: --qos-reserved="" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905222 4860 flags.go:64] FLAG: --read-only-port="10255" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905270 4860 flags.go:64] FLAG: --register-node="true" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905279 4860 flags.go:64] FLAG: --register-schedulable="true" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905288 4860 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905347 4860 flags.go:64] FLAG: --registry-burst="10" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905356 4860 flags.go:64] FLAG: --registry-qps="5" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905365 4860 flags.go:64] FLAG: --reserved-cpus="" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905376 4860 flags.go:64] FLAG: --reserved-memory="" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905388 4860 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905397 4860 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905406 4860 flags.go:64] FLAG: --rotate-certificates="false" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905415 4860 flags.go:64] FLAG: --rotate-server-certificates="false" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905425 4860 flags.go:64] FLAG: --runonce="false" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905434 4860 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 
10:54:36.905444 4860 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905454 4860 flags.go:64] FLAG: --seccomp-default="false" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905463 4860 flags.go:64] FLAG: --serialize-image-pulls="true" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905472 4860 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905482 4860 flags.go:64] FLAG: --storage-driver-db="cadvisor" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905491 4860 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905500 4860 flags.go:64] FLAG: --storage-driver-password="root" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905509 4860 flags.go:64] FLAG: --storage-driver-secure="false" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905518 4860 flags.go:64] FLAG: --storage-driver-table="stats" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905527 4860 flags.go:64] FLAG: --storage-driver-user="root" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905536 4860 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905546 4860 flags.go:64] FLAG: --sync-frequency="1m0s" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905555 4860 flags.go:64] FLAG: --system-cgroups="" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905564 4860 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905578 4860 flags.go:64] FLAG: --system-reserved-cgroup="" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905587 4860 flags.go:64] FLAG: --tls-cert-file="" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905596 4860 flags.go:64] FLAG: --tls-cipher-suites="[]" Mar 20 10:54:36 
crc kubenswrapper[4860]: I0320 10:54:36.905609 4860 flags.go:64] FLAG: --tls-min-version="" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905619 4860 flags.go:64] FLAG: --tls-private-key-file="" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905628 4860 flags.go:64] FLAG: --topology-manager-policy="none" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905637 4860 flags.go:64] FLAG: --topology-manager-policy-options="" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905646 4860 flags.go:64] FLAG: --topology-manager-scope="container" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905655 4860 flags.go:64] FLAG: --v="2" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905666 4860 flags.go:64] FLAG: --version="false" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905678 4860 flags.go:64] FLAG: --vmodule="" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905688 4860 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905698 4860 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.905987 4860 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906006 4860 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906031 4860 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906049 4860 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906061 4860 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906074 4860 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906085 4860 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906096 4860 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906110 4860 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906122 4860 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906132 4860 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906140 4860 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906149 4860 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906158 4860 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906167 4860 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906175 4860 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906183 4860 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906190 4860 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906199 4860 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906207 4860 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906214 4860 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906222 4860 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906265 4860 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906274 4860 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906282 4860 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906290 4860 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906297 4860 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906308 4860 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906318 4860 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906327 4860 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906335 4860 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906344 4860 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906352 4860 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906363 4860 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906372 4860 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906381 4860 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906389 4860 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906397 4860 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906407 4860 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906415 4860 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906423 4860 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906431 4860 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906438 4860 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906446 4860 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906453 4860 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906461 4860 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906469 4860 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906476 4860 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906484 4860 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906492 4860 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906500 4860 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906508 4860 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906515 4860 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906524 4860 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906533 4860 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906544 4860 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906554 4860 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906565 4860 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906574 4860 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906582 4860 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906592 4860 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906602 4860 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906612 4860 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906622 4860 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906632 4860 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906643 4860 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906652 4860 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906659 4860 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906667 4860 feature_gate.go:330] unrecognized feature gate: Example
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906674 4860 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906683 4860 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.906695 4860 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.932192 4860 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.932272 4860 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932365 4860 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932375 4860 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932380 4860 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932386 4860 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932391 4860 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932397 4860 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932401 4860 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932407 4860 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932412 4860 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932417 4860 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932422 4860 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932427 4860 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932432 4860 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932437 4860 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932442 4860 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932448 4860 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932453 4860 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932458 4860 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932463 4860 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932468 4860 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932473 4860 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932478 4860 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932483 4860 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932488 4860 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932494 4860 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932503 4860 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932509 4860 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932514 4860 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932521 4860 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932529 4860 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932535 4860 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932540 4860 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932546 4860 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932551 4860 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932556 4860 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932562 4860 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932567 4860 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932572 4860 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932576 4860 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932582 4860 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932587 4860 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932592 4860 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932598 4860 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932603 4860 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932608 4860 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932613 4860 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932619 4860 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932624 4860 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932629 4860 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932633 4860 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932639 4860 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932645 4860 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932651 4860 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932656 4860 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932661 4860 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932666 4860 feature_gate.go:330] unrecognized feature gate: Example
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932671 4860 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932675 4860 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932683 4860 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932688 4860 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932693 4860 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932698 4860 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932703 4860 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932708 4860 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932713 4860 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932718 4860 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932723 4860 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932729 4860 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932735 4860 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932740 4860 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932753 4860 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.932762 4860 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932954 4860 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932969 4860 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932975 4860 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932982 4860 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932989 4860 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932995 4860 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933001 4860 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933007 4860 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933012 4860 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933018 4860 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933023 4860 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933028 4860 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933033 4860 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933039 4860 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933044 4860 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933051 4860 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933057 4860 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933063 4860 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933068 4860 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933074 4860 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933079 4860 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933084 4860 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933089 4860 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933094 4860 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933099 4860 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933104 4860 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933109 4860 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933114 4860 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933119 4860 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933124 4860 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933130 4860 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933136 4860 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933142 4860 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933148 4860 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933166 4860 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933175 4860 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933182 4860 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933188 4860 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933194 4860 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933200 4860 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933206 4860 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933261 4860 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933270 4860 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933275 4860 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933282 4860 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933287 4860 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933294 4860 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933300 4860 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933306 4860 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933312 4860 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933317 4860 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933323 4860 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933329 4860 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933335 4860 feature_gate.go:330] unrecognized feature gate: Example
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933342 4860 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933347 4860 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933353 4860 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933359 4860 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933364 4860 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933369 4860 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933375 4860 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933381 4860 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933387 4860 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933395 4860 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933402 4860 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933409 4860 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933415 4860 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933420 4860 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933426 4860 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933431 4860 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933455 4860 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.933466 4860 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.933747 4860 server.go:940] "Client rotation is on, will bootstrap in background"
Mar 20 10:54:36 crc kubenswrapper[4860]: E0320 10:54:36.954391 4860 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError"
Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.960629 4860 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.960775 4860 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.962624 4860 server.go:997] "Starting client certificate rotation"
Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.962746 4860 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.962927 4860 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.054369 4860 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 20 10:54:37 crc kubenswrapper[4860]: E0320 10:54:37.057372 4860 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.201:6443: connect: connection refused" logger="UnhandledError"
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.057508 4860 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.098571 4860 log.go:25] "Validated CRI v1 runtime API"
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.199614 4860 log.go:25] "Validated CRI v1 image API"
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.202081 4860 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.207196 4860 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-20-10-49-35-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.207287 4860 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:41 fsType:tmpfs blockSize:0}]
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.237194 4860 manager.go:217] Machine: {Timestamp:2026-03-20 10:54:37.234640394 +0000 UTC m=+1.456001372 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654132736 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:5064a76f-5382-46f7-bae1-fe91bc80db78 BootID:d21bb8ef-2c26-4952-9b24-e8f54bfb6e63 Filesystems:[{Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827068416 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:41 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730829824 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:da:04:9a Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:da:04:9a Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:b1:1c:a5 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:6b:e9:f9 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:9c:60:c6 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:aa:83:01 Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:bb:20:d7 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:a2:33:e0:9b:18:d8 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:2a:1a:6b:5b:9d:a7 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654132736 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified
Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.238037 4860 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.238404 4860 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.239046 4860 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.239561 4860 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.239781 4860 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.241124 4860 topology_manager.go:138] "Creating topology manager with none policy" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.241356 4860 container_manager_linux.go:303] "Creating device plugin manager" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.242301 4860 manager.go:142] "Creating Device Plugin manager" 
path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.242456 4860 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.243300 4860 state_mem.go:36] "Initialized new in-memory state store" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.243642 4860 server.go:1245] "Using root directory" path="/var/lib/kubelet" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.246932 4860 kubelet.go:418] "Attempting to sync node with API server" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.247061 4860 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.247164 4860 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.247280 4860 kubelet.go:324] "Adding apiserver pod source" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.247400 4860 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.251732 4860 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.253114 4860 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.255741 4860 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 20 10:54:37 crc kubenswrapper[4860]: W0320 10:54:37.256885 4860 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.201:6443: connect: connection refused Mar 20 10:54:37 crc kubenswrapper[4860]: E0320 10:54:37.257019 4860 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.201:6443: connect: connection refused" logger="UnhandledError" Mar 20 10:54:37 crc kubenswrapper[4860]: W0320 10:54:37.256889 4860 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.201:6443: connect: connection refused Mar 20 10:54:37 crc kubenswrapper[4860]: E0320 10:54:37.257198 4860 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.201:6443: connect: connection refused" logger="UnhandledError" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.257125 4860 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.257380 4860 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 
10:54:37.257431 4860 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.257478 4860 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.257528 4860 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.257576 4860 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.257622 4860 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.257682 4860 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.257736 4860 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.257781 4860 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.257847 4860 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.257898 4860 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.258721 4860 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.259031 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.201:6443: connect: connection refused Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.259408 4860 server.go:1280] "Started kubelet" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.261422 4860 
ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.259492 4860 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 20 10:54:37 crc systemd[1]: Started Kubernetes Kubelet. Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.264427 4860 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.265317 4860 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.265393 4860 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.265656 4860 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.265682 4860 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.265869 4860 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 20 10:54:37 crc kubenswrapper[4860]: E0320 10:54:37.265709 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:54:37 crc kubenswrapper[4860]: W0320 10:54:37.266444 4860 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.201:6443: connect: connection refused Mar 20 10:54:37 crc kubenswrapper[4860]: E0320 10:54:37.266512 4860 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 
38.102.83.201:6443: connect: connection refused" logger="UnhandledError" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.268117 4860 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.268179 4860 factory.go:55] Registering systemd factory Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.268193 4860 factory.go:221] Registration of the systemd container factory successfully Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.302260 4860 server.go:460] "Adding debug handlers to kubelet server" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.303282 4860 factory.go:153] Registering CRI-O factory Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.303483 4860 factory.go:221] Registration of the crio container factory successfully Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.303504 4860 factory.go:103] Registering Raw factory Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.303520 4860 manager.go:1196] Started watching for new ooms in manager Mar 20 10:54:37 crc kubenswrapper[4860]: E0320 10:54:37.302405 4860 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.201:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189e874f2723de82 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:37.259382402 +0000 UTC m=+1.480743300,LastTimestamp:2026-03-20 10:54:37.259382402 +0000 UTC 
m=+1.480743300,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.304634 4860 manager.go:319] Starting recovery of all containers Mar 20 10:54:37 crc kubenswrapper[4860]: E0320 10:54:37.304620 4860 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused" interval="200ms" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.312451 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.312584 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.312652 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.312688 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.312725 4860 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.312757 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.312775 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.312797 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.312819 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.312865 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.312883 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.312936 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.312955 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.313050 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.313071 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.313089 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.313139 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" 
volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.313159 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.313870 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.314546 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.314587 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.314615 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.314641 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.314672 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.314694 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.314719 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.314748 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.314773 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.314796 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" 
volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.314906 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.314936 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.314968 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315001 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315030 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315058 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" 
volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315087 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315118 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315147 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315175 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315208 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315275 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315304 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315334 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315358 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315381 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315403 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315424 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" 
seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315450 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315470 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315492 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315513 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315536 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315570 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Mar 20 10:54:37 crc 
kubenswrapper[4860]: I0320 10:54:37.315597 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315622 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315651 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315675 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315700 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315722 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315742 4860 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315764 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315784 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315806 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315826 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315845 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315866 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315889 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315912 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315931 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315951 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315972 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315995 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" 
volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.316012 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.316033 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.316053 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.316072 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.316091 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.316158 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" 
volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.316179 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.316202 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.316261 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.316292 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.316315 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.316339 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.316360 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.316383 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.316419 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.316486 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.316513 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.316535 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" 
seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.316556 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.316577 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.316601 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.316623 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.316748 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.316772 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.316795 4860 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.316815 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.316840 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.316862 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.316884 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.316903 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.316927 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.316951 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.316982 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.317004 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.317026 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.317050 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.317072 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" 
volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.317096 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.317164 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.317192 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.317215 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.317303 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.317335 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" 
volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.317364 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.317393 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.317419 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.317441 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.317463 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.317486 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" 
volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.317506 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.317524 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.317544 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.317565 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.317586 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.317606 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" 
volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.317626 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.317646 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.317664 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.317686 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.317709 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.317730 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" 
seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.317748 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.317768 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.317791 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.317817 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.317847 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.317875 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.317902 4860 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.317938 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.325948 4860 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326014 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326051 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326082 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Mar 20 10:54:37 crc 
kubenswrapper[4860]: I0320 10:54:37.326107 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326134 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326156 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326175 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326200 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326218 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326265 4860 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326283 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326299 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326319 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326335 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326350 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326370 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326387 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326408 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326424 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326440 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326459 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326485 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326515 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326537 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326562 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326592 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326616 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326645 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" 
volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326665 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326684 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326709 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326734 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326760 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326782 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" 
volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326800 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326827 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326850 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326874 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326893 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326911 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" 
volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326936 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326958 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326977 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.327000 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.327020 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.327044 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" 
seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.327065 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.327086 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.327111 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.327133 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.327164 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.327185 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 
10:54:37.327205 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.327268 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.327289 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.327317 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.327338 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.327361 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.327388 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.327407 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.327437 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.327463 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.327489 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.327518 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.327539 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" 
volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.327566 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.327584 4860 reconstruct.go:97] "Volume reconstruction finished" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.327600 4860 reconciler.go:26] "Reconciler: start to sync state" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.335495 4860 manager.go:324] Recovery completed Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.346134 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.348562 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.348625 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.348643 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.349870 4860 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.349895 4860 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.349926 4860 state_mem.go:36] "Initialized new in-memory state store" Mar 20 10:54:37 crc kubenswrapper[4860]: E0320 10:54:37.366323 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" 
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.409888 4860 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.411995 4860 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.412062 4860 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.412093 4860 kubelet.go:2335] "Starting kubelet main sync loop" Mar 20 10:54:37 crc kubenswrapper[4860]: E0320 10:54:37.412150 4860 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 20 10:54:37 crc kubenswrapper[4860]: W0320 10:54:37.413015 4860 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.201:6443: connect: connection refused Mar 20 10:54:37 crc kubenswrapper[4860]: E0320 10:54:37.413094 4860 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.201:6443: connect: connection refused" logger="UnhandledError" Mar 20 10:54:37 crc kubenswrapper[4860]: E0320 10:54:37.467202 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.505880 4860 policy_none.go:49] "None policy: Start" Mar 20 10:54:37 crc kubenswrapper[4860]: E0320 10:54:37.506861 4860 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused" interval="400ms" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.507784 4860 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.507862 4860 state_mem.go:35] "Initializing new in-memory state store" Mar 20 10:54:37 crc kubenswrapper[4860]: E0320 10:54:37.512410 4860 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.557041 4860 manager.go:334] "Starting Device Plugin manager" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.557115 4860 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.557128 4860 server.go:79] "Starting device plugin registration server" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.557734 4860 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.557757 4860 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.557964 4860 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.558163 4860 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.558243 4860 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 20 10:54:37 crc kubenswrapper[4860]: E0320 10:54:37.568893 4860 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" 
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.660585 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.661749 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.661795 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.661808 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.661846 4860 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 10:54:37 crc kubenswrapper[4860]: E0320 10:54:37.662488 4860 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.201:6443: connect: connection refused" node="crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.712826 4860 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.712953 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.714550 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.714621 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:37 crc 
kubenswrapper[4860]: I0320 10:54:37.714635 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.714855 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.715320 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.715419 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.716170 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.716206 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.716217 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.716404 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.716546 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.716590 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.717159 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.717189 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.717202 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.717621 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.717653 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.717664 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.717711 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.717743 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.717756 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.717758 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.717848 
4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.717873 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.718676 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.718705 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.718713 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.718928 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.719004 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.719016 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.719213 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.719411 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.719482 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.720370 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.720420 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.720436 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.721076 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.721186 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.721494 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.721562 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.721582 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.724811 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.724866 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.724886 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.835464 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.835533 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.835569 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.835656 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.835711 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " 
pod="openshift-etcd/etcd-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.835733 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.835752 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.835770 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.835957 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.836119 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.836196 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.836309 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.836408 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.836512 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.836593 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 
10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.862767 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.864909 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.864983 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.865008 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.865060 4860 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 10:54:37 crc kubenswrapper[4860]: E0320 10:54:37.865729 4860 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.201:6443: connect: connection refused" node="crc" Mar 20 10:54:37 crc kubenswrapper[4860]: E0320 10:54:37.908957 4860 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused" interval="800ms" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.938314 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.938428 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.938466 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.938496 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.938525 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.938624 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.938660 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " 
pod="openshift-etcd/etcd-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.938691 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.938726 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.938755 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.938798 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.938834 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.938875 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.938874 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.938921 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.938963 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.938968 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.939018 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 
10:54:37.938906 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.938894 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.939058 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.938873 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.939139 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.939138 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") 
pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.939146 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.939171 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.939159 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.939211 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.939345 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 10:54:37 
crc kubenswrapper[4860]: I0320 10:54:37.939177 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:54:38 crc kubenswrapper[4860]: I0320 10:54:38.054950 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 10:54:38 crc kubenswrapper[4860]: I0320 10:54:38.079894 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 20 10:54:38 crc kubenswrapper[4860]: I0320 10:54:38.089435 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:54:38 crc kubenswrapper[4860]: I0320 10:54:38.105831 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:54:38 crc kubenswrapper[4860]: I0320 10:54:38.111274 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 10:54:38 crc kubenswrapper[4860]: W0320 10:54:38.212685 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-5b8f68d1880d2199dfe0832a4a4b11156fd64cae44a14371af0f6373df9382ff WatchSource:0}: Error finding container 5b8f68d1880d2199dfe0832a4a4b11156fd64cae44a14371af0f6373df9382ff: Status 404 returned error can't find the container with id 5b8f68d1880d2199dfe0832a4a4b11156fd64cae44a14371af0f6373df9382ff Mar 20 10:54:38 crc kubenswrapper[4860]: W0320 10:54:38.220744 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-4040a69d2e69ca1076e30b21da8368c21cffcac5c54a897687c045e66d118762 WatchSource:0}: Error finding container 4040a69d2e69ca1076e30b21da8368c21cffcac5c54a897687c045e66d118762: Status 404 returned error can't find the container with id 4040a69d2e69ca1076e30b21da8368c21cffcac5c54a897687c045e66d118762 Mar 20 10:54:38 crc kubenswrapper[4860]: W0320 10:54:38.225797 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-d26046e0fc04d62fede089c65376ca817da3aa3eb1742f8fcd311078e9176ae7 WatchSource:0}: Error finding container d26046e0fc04d62fede089c65376ca817da3aa3eb1742f8fcd311078e9176ae7: Status 404 returned error can't find the container with id d26046e0fc04d62fede089c65376ca817da3aa3eb1742f8fcd311078e9176ae7 Mar 20 10:54:38 crc kubenswrapper[4860]: W0320 10:54:38.230562 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-b4e0f0c84731c097d29fd9bf509b5837e268a47875b5079a5374a0db41ce730b 
WatchSource:0}: Error finding container b4e0f0c84731c097d29fd9bf509b5837e268a47875b5079a5374a0db41ce730b: Status 404 returned error can't find the container with id b4e0f0c84731c097d29fd9bf509b5837e268a47875b5079a5374a0db41ce730b Mar 20 10:54:38 crc kubenswrapper[4860]: W0320 10:54:38.231661 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-976163a14d3e179ca944b38f466bf4a70df3e317ec9422624d678378f7eb2c7d WatchSource:0}: Error finding container 976163a14d3e179ca944b38f466bf4a70df3e317ec9422624d678378f7eb2c7d: Status 404 returned error can't find the container with id 976163a14d3e179ca944b38f466bf4a70df3e317ec9422624d678378f7eb2c7d Mar 20 10:54:38 crc kubenswrapper[4860]: W0320 10:54:38.239264 4860 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.201:6443: connect: connection refused Mar 20 10:54:38 crc kubenswrapper[4860]: E0320 10:54:38.239402 4860 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.201:6443: connect: connection refused" logger="UnhandledError" Mar 20 10:54:38 crc kubenswrapper[4860]: I0320 10:54:38.260040 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.201:6443: connect: connection refused Mar 20 10:54:38 crc kubenswrapper[4860]: I0320 10:54:38.266148 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:38 crc 
kubenswrapper[4860]: I0320 10:54:38.267318 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:38 crc kubenswrapper[4860]: I0320 10:54:38.267384 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:38 crc kubenswrapper[4860]: I0320 10:54:38.267404 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:38 crc kubenswrapper[4860]: I0320 10:54:38.267465 4860 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 10:54:38 crc kubenswrapper[4860]: E0320 10:54:38.268000 4860 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.201:6443: connect: connection refused" node="crc" Mar 20 10:54:38 crc kubenswrapper[4860]: W0320 10:54:38.298722 4860 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.201:6443: connect: connection refused Mar 20 10:54:38 crc kubenswrapper[4860]: E0320 10:54:38.298823 4860 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.201:6443: connect: connection refused" logger="UnhandledError" Mar 20 10:54:38 crc kubenswrapper[4860]: I0320 10:54:38.417567 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b4e0f0c84731c097d29fd9bf509b5837e268a47875b5079a5374a0db41ce730b"} Mar 20 
10:54:38 crc kubenswrapper[4860]: I0320 10:54:38.419027 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"d26046e0fc04d62fede089c65376ca817da3aa3eb1742f8fcd311078e9176ae7"} Mar 20 10:54:38 crc kubenswrapper[4860]: I0320 10:54:38.420477 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"976163a14d3e179ca944b38f466bf4a70df3e317ec9422624d678378f7eb2c7d"} Mar 20 10:54:38 crc kubenswrapper[4860]: I0320 10:54:38.421912 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4040a69d2e69ca1076e30b21da8368c21cffcac5c54a897687c045e66d118762"} Mar 20 10:54:38 crc kubenswrapper[4860]: I0320 10:54:38.423524 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"5b8f68d1880d2199dfe0832a4a4b11156fd64cae44a14371af0f6373df9382ff"} Mar 20 10:54:38 crc kubenswrapper[4860]: E0320 10:54:38.710270 4860 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused" interval="1.6s" Mar 20 10:54:38 crc kubenswrapper[4860]: W0320 10:54:38.733202 4860 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.201:6443: connect: connection refused Mar 20 10:54:38 crc kubenswrapper[4860]: E0320 10:54:38.733409 4860 reflector.go:158] 
"Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.201:6443: connect: connection refused" logger="UnhandledError" Mar 20 10:54:38 crc kubenswrapper[4860]: W0320 10:54:38.772777 4860 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.201:6443: connect: connection refused Mar 20 10:54:38 crc kubenswrapper[4860]: E0320 10:54:38.772908 4860 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.201:6443: connect: connection refused" logger="UnhandledError" Mar 20 10:54:39 crc kubenswrapper[4860]: I0320 10:54:39.068742 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:39 crc kubenswrapper[4860]: I0320 10:54:39.070153 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:39 crc kubenswrapper[4860]: I0320 10:54:39.070216 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:39 crc kubenswrapper[4860]: I0320 10:54:39.070252 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:39 crc kubenswrapper[4860]: I0320 10:54:39.070285 4860 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 10:54:39 crc kubenswrapper[4860]: E0320 10:54:39.070793 4860 kubelet_node_status.go:99] "Unable to register node with API server" 
err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.201:6443: connect: connection refused" node="crc"
Mar 20 10:54:39 crc kubenswrapper[4860]: I0320 10:54:39.088964 4860 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 20 10:54:39 crc kubenswrapper[4860]: E0320 10:54:39.090388 4860 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.201:6443: connect: connection refused" logger="UnhandledError"
Mar 20 10:54:39 crc kubenswrapper[4860]: I0320 10:54:39.260970 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.201:6443: connect: connection refused
Mar 20 10:54:40 crc kubenswrapper[4860]: I0320 10:54:40.260178 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.201:6443: connect: connection refused
Mar 20 10:54:40 crc kubenswrapper[4860]: E0320 10:54:40.311684 4860 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused" interval="3.2s"
Mar 20 10:54:40 crc kubenswrapper[4860]: I0320 10:54:40.432178 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6bf4a38879e8e3c687bd3c57a3c68a29ad9a9e609ea0cbd220493b6ee4e7d9a3"}
Mar 20 10:54:40 crc kubenswrapper[4860]: I0320 10:54:40.434718 4860 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="e74e43d4563dfd8a16d0e44d9e42cbad850bf1fe0f96f4cf7d1699ea3b0beec9" exitCode=0
Mar 20 10:54:40 crc kubenswrapper[4860]: I0320 10:54:40.434860 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"e74e43d4563dfd8a16d0e44d9e42cbad850bf1fe0f96f4cf7d1699ea3b0beec9"}
Mar 20 10:54:40 crc kubenswrapper[4860]: I0320 10:54:40.434939 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 10:54:40 crc kubenswrapper[4860]: I0320 10:54:40.436589 4860 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90" exitCode=0
Mar 20 10:54:40 crc kubenswrapper[4860]: I0320 10:54:40.436691 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90"}
Mar 20 10:54:40 crc kubenswrapper[4860]: I0320 10:54:40.436732 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 10:54:40 crc kubenswrapper[4860]: I0320 10:54:40.436964 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:54:40 crc kubenswrapper[4860]: I0320 10:54:40.436995 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:54:40 crc kubenswrapper[4860]: I0320 10:54:40.437004 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:54:40 crc kubenswrapper[4860]: I0320 10:54:40.437899 4860 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4" exitCode=0
Mar 20 10:54:40 crc kubenswrapper[4860]: I0320 10:54:40.437948 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4"}
Mar 20 10:54:40 crc kubenswrapper[4860]: I0320 10:54:40.438043 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 10:54:40 crc kubenswrapper[4860]: I0320 10:54:40.438382 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:54:40 crc kubenswrapper[4860]: I0320 10:54:40.438452 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:54:40 crc kubenswrapper[4860]: I0320 10:54:40.438498 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:54:40 crc kubenswrapper[4860]: I0320 10:54:40.440634 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:54:40 crc kubenswrapper[4860]: I0320 10:54:40.440687 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:54:40 crc kubenswrapper[4860]: I0320 10:54:40.440713 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:54:40 crc kubenswrapper[4860]: I0320 10:54:40.440709 4860 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="229f12dfeed51b7ac52ddcc0137be0fa53ddc12d3969ca2c10cdf6d6ac80932d" exitCode=0
Mar 20 10:54:40 crc kubenswrapper[4860]: I0320 10:54:40.440754 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"229f12dfeed51b7ac52ddcc0137be0fa53ddc12d3969ca2c10cdf6d6ac80932d"}
Mar 20 10:54:40 crc kubenswrapper[4860]: I0320 10:54:40.440837 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 10:54:40 crc kubenswrapper[4860]: I0320 10:54:40.442142 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:54:40 crc kubenswrapper[4860]: I0320 10:54:40.442169 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:54:40 crc kubenswrapper[4860]: I0320 10:54:40.442180 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:54:40 crc kubenswrapper[4860]: I0320 10:54:40.442617 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 10:54:40 crc kubenswrapper[4860]: I0320 10:54:40.443899 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:54:40 crc kubenswrapper[4860]: I0320 10:54:40.443949 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:54:40 crc kubenswrapper[4860]: I0320 10:54:40.443961 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:54:40 crc kubenswrapper[4860]: W0320 10:54:40.457580 4860 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.201:6443: connect: connection refused
Mar 20 10:54:40 crc kubenswrapper[4860]: E0320 10:54:40.457692 4860 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.201:6443: connect: connection refused" logger="UnhandledError"
Mar 20 10:54:40 crc kubenswrapper[4860]: I0320 10:54:40.671814 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 10:54:40 crc kubenswrapper[4860]: I0320 10:54:40.672957 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:54:40 crc kubenswrapper[4860]: I0320 10:54:40.672993 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:54:40 crc kubenswrapper[4860]: I0320 10:54:40.673005 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:54:40 crc kubenswrapper[4860]: I0320 10:54:40.673031 4860 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 20 10:54:40 crc kubenswrapper[4860]: E0320 10:54:40.673603 4860 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.201:6443: connect: connection refused" node="crc"
Mar 20 10:54:40 crc kubenswrapper[4860]: W0320 10:54:40.787524 4860 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.201:6443: connect: connection refused
Mar 20 10:54:40 crc kubenswrapper[4860]: E0320 10:54:40.787607 4860 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.201:6443: connect: connection refused" logger="UnhandledError"
Mar 20 10:54:40 crc kubenswrapper[4860]: W0320 10:54:40.896244 4860 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.201:6443: connect: connection refused
Mar 20 10:54:40 crc kubenswrapper[4860]: E0320 10:54:40.896331 4860 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.201:6443: connect: connection refused" logger="UnhandledError"
Mar 20 10:54:41 crc kubenswrapper[4860]: I0320 10:54:41.260070 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.201:6443: connect: connection refused
Mar 20 10:54:41 crc kubenswrapper[4860]: I0320 10:54:41.446651 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b411b2d78ae0ca6e465eafe2ca565d78630979ffc93ff9fb0785c70d42e4c447"}
Mar 20 10:54:41 crc kubenswrapper[4860]: I0320 10:54:41.446716 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 10:54:41 crc kubenswrapper[4860]: I0320 10:54:41.446734 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4edc7f588e4fdfa1c92fcf94e685925ef7708d48f6dc4a72363331f66f0b4ab7"}
Mar 20 10:54:41 crc kubenswrapper[4860]: I0320 10:54:41.446754 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"37468ccfc2574c3aefd9c88a616ce0f0c69cfa3a56b41c421d53461d4f8b2cc2"}
Mar 20 10:54:41 crc kubenswrapper[4860]: I0320 10:54:41.450599 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:54:41 crc kubenswrapper[4860]: I0320 10:54:41.450672 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:54:41 crc kubenswrapper[4860]: I0320 10:54:41.450702 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:54:41 crc kubenswrapper[4860]: I0320 10:54:41.452952 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"acfeabcb583a79ac597fdc7d5ea5a7e28192a337743c9ac0d585d931e1a4406c"}
Mar 20 10:54:41 crc kubenswrapper[4860]: I0320 10:54:41.453003 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"b968284eb846046e80483bafc625da31241ff1781ebd5216342bc2538064d124"}
Mar 20 10:54:41 crc kubenswrapper[4860]: I0320 10:54:41.453014 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"8f9917a32ef81256fec12a0a0679be3cf3a2e1f5dab824c4c23c2cf252433a68"}
Mar 20 10:54:41 crc kubenswrapper[4860]: I0320 10:54:41.453133 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 10:54:41 crc kubenswrapper[4860]: I0320 10:54:41.454049 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:54:41 crc kubenswrapper[4860]: I0320 10:54:41.454074 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:54:41 crc kubenswrapper[4860]: I0320 10:54:41.454084 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:54:41 crc kubenswrapper[4860]: I0320 10:54:41.457246 4860 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69" exitCode=0
Mar 20 10:54:41 crc kubenswrapper[4860]: I0320 10:54:41.457365 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69"}
Mar 20 10:54:41 crc kubenswrapper[4860]: I0320 10:54:41.457406 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 10:54:41 crc kubenswrapper[4860]: I0320 10:54:41.458471 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:54:41 crc kubenswrapper[4860]: I0320 10:54:41.458513 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:54:41 crc kubenswrapper[4860]: I0320 10:54:41.458528 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:54:41 crc kubenswrapper[4860]: I0320 10:54:41.464492 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c1e891984b7d45630352f0e37bf11ec8f9f8df49a17c27c868951abcc16e1a89"}
Mar 20 10:54:41 crc kubenswrapper[4860]: I0320 10:54:41.464557 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392"}
Mar 20 10:54:41 crc kubenswrapper[4860]: I0320 10:54:41.464579 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017"}
Mar 20 10:54:41 crc kubenswrapper[4860]: I0320 10:54:41.464590 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e"}
Mar 20 10:54:41 crc kubenswrapper[4860]: I0320 10:54:41.464601 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433"}
Mar 20 10:54:41 crc kubenswrapper[4860]: I0320 10:54:41.464709 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 10:54:41 crc kubenswrapper[4860]: I0320 10:54:41.465837 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:54:41 crc kubenswrapper[4860]: I0320 10:54:41.465876 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:54:41 crc kubenswrapper[4860]: I0320 10:54:41.465891 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:54:41 crc kubenswrapper[4860]: I0320 10:54:41.468708 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 10:54:41 crc kubenswrapper[4860]: I0320 10:54:41.468576 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"9696d34484c79734fbd6e1a40f5e4ce6a680b0a67c52cb58ff7a1a1feb8390ed"}
Mar 20 10:54:41 crc kubenswrapper[4860]: I0320 10:54:41.469632 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:54:41 crc kubenswrapper[4860]: I0320 10:54:41.469688 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:54:41 crc kubenswrapper[4860]: I0320 10:54:41.469702 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:54:41 crc kubenswrapper[4860]: W0320 10:54:41.500608 4860 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.201:6443: connect: connection refused
Mar 20 10:54:41 crc kubenswrapper[4860]: E0320 10:54:41.500720 4860 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.201:6443: connect: connection refused" logger="UnhandledError"
Mar 20 10:54:42 crc kubenswrapper[4860]: I0320 10:54:42.474644 4860 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691" exitCode=0
Mar 20 10:54:42 crc kubenswrapper[4860]: I0320 10:54:42.474759 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691"}
Mar 20 10:54:42 crc kubenswrapper[4860]: I0320 10:54:42.474800 4860 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 20 10:54:42 crc kubenswrapper[4860]: I0320 10:54:42.474838 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 10:54:42 crc kubenswrapper[4860]: I0320 10:54:42.474861 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 10:54:42 crc kubenswrapper[4860]: I0320 10:54:42.474784 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 10:54:42 crc kubenswrapper[4860]: I0320 10:54:42.474937 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 10:54:42 crc kubenswrapper[4860]: I0320 10:54:42.475028 4860 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 20 10:54:42 crc kubenswrapper[4860]: I0320 10:54:42.475141 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 10:54:42 crc kubenswrapper[4860]: I0320 10:54:42.477088 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:54:42 crc kubenswrapper[4860]: I0320 10:54:42.477140 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:54:42 crc kubenswrapper[4860]: I0320 10:54:42.477155 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:54:42 crc kubenswrapper[4860]: I0320 10:54:42.477533 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:54:42 crc kubenswrapper[4860]: I0320 10:54:42.477594 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:54:42 crc kubenswrapper[4860]: I0320 10:54:42.477621 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:54:42 crc kubenswrapper[4860]: I0320 10:54:42.477887 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:54:42 crc kubenswrapper[4860]: I0320 10:54:42.477915 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:54:42 crc kubenswrapper[4860]: I0320 10:54:42.477927 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:54:42 crc kubenswrapper[4860]: I0320 10:54:42.477940 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:54:42 crc kubenswrapper[4860]: I0320 10:54:42.477956 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:54:42 crc kubenswrapper[4860]: I0320 10:54:42.477971 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:54:42 crc kubenswrapper[4860]: I0320 10:54:42.477982 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:54:42 crc kubenswrapper[4860]: I0320 10:54:42.477988 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:54:42 crc kubenswrapper[4860]: I0320 10:54:42.477996 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:54:42 crc kubenswrapper[4860]: I0320 10:54:42.638936 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 10:54:43 crc kubenswrapper[4860]: I0320 10:54:43.111187 4860 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 20 10:54:43 crc kubenswrapper[4860]: I0320 10:54:43.437752 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 10:54:43 crc kubenswrapper[4860]: I0320 10:54:43.445648 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 10:54:43 crc kubenswrapper[4860]: I0320 10:54:43.483375 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78"}
Mar 20 10:54:43 crc kubenswrapper[4860]: I0320 10:54:43.483478 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63"}
Mar 20 10:54:43 crc kubenswrapper[4860]: I0320 10:54:43.483505 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444"}
Mar 20 10:54:43 crc kubenswrapper[4860]: I0320 10:54:43.483523 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 10:54:43 crc kubenswrapper[4860]: I0320 10:54:43.483524 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934"}
Mar 20 10:54:43 crc kubenswrapper[4860]: I0320 10:54:43.483658 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 10:54:43 crc kubenswrapper[4860]: I0320 10:54:43.483520 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 10:54:43 crc kubenswrapper[4860]: I0320 10:54:43.484640 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:54:43 crc kubenswrapper[4860]: I0320 10:54:43.484692 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:54:43 crc kubenswrapper[4860]: I0320 10:54:43.484709 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:54:43 crc kubenswrapper[4860]: I0320 10:54:43.485301 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:54:43 crc kubenswrapper[4860]: I0320 10:54:43.485334 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:54:43 crc kubenswrapper[4860]: I0320 10:54:43.485346 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:54:43 crc kubenswrapper[4860]: I0320 10:54:43.874051 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 10:54:43 crc kubenswrapper[4860]: I0320 10:54:43.876088 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:54:43 crc kubenswrapper[4860]: I0320 10:54:43.876201 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:54:43 crc kubenswrapper[4860]: I0320 10:54:43.876290 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:54:43 crc kubenswrapper[4860]: I0320 10:54:43.876350 4860 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 20 10:54:44 crc kubenswrapper[4860]: I0320 10:54:44.076190 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 20 10:54:44 crc kubenswrapper[4860]: I0320 10:54:44.076460 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 10:54:44 crc kubenswrapper[4860]: I0320 10:54:44.078059 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:54:44 crc kubenswrapper[4860]: I0320 10:54:44.078139 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:54:44 crc kubenswrapper[4860]: I0320 10:54:44.078158 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:54:44 crc kubenswrapper[4860]: I0320 10:54:44.368014 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 10:54:44 crc kubenswrapper[4860]: I0320 10:54:44.491316 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e"}
Mar 20 10:54:44 crc kubenswrapper[4860]: I0320 10:54:44.491381 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 10:54:44 crc kubenswrapper[4860]: I0320 10:54:44.491454 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 10:54:44 crc kubenswrapper[4860]: I0320 10:54:44.491565 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 10:54:44 crc kubenswrapper[4860]: I0320 10:54:44.492893 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:54:44 crc kubenswrapper[4860]: I0320 10:54:44.492930 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:54:44 crc kubenswrapper[4860]: I0320 10:54:44.492944 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:54:44 crc kubenswrapper[4860]: I0320 10:54:44.492956 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:54:44 crc kubenswrapper[4860]: I0320 10:54:44.492986 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:54:44 crc kubenswrapper[4860]: I0320 10:54:44.492995 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:54:44 crc kubenswrapper[4860]: I0320 10:54:44.493021 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:54:44 crc kubenswrapper[4860]: I0320 10:54:44.493055 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:54:44 crc kubenswrapper[4860]: I0320 10:54:44.493073 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:54:45 crc kubenswrapper[4860]: I0320 10:54:45.494419 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 10:54:45 crc kubenswrapper[4860]: I0320 10:54:45.495870 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:54:45 crc kubenswrapper[4860]: I0320 10:54:45.495915 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:54:45 crc kubenswrapper[4860]: I0320 10:54:45.495934 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:54:45 crc kubenswrapper[4860]: I0320 10:54:45.648132 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Mar 20 10:54:45 crc kubenswrapper[4860]: I0320 10:54:45.953215 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 10:54:45 crc kubenswrapper[4860]: I0320 10:54:45.953580 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 10:54:45 crc kubenswrapper[4860]: I0320 10:54:45.955750 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:54:45 crc kubenswrapper[4860]: I0320 10:54:45.955820 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:54:45 crc kubenswrapper[4860]: I0320 10:54:45.955834 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:54:46 crc kubenswrapper[4860]: I0320 10:54:46.497302 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 10:54:46 crc kubenswrapper[4860]: I0320 10:54:46.498708 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:54:46 crc kubenswrapper[4860]: I0320 10:54:46.498771 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:54:46 crc kubenswrapper[4860]: I0320 10:54:46.498795 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:54:46 crc kubenswrapper[4860]: I0320 10:54:46.900831 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 10:54:46 crc kubenswrapper[4860]: I0320 10:54:46.901064 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 10:54:46 crc kubenswrapper[4860]: I0320 10:54:46.902656 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:54:46 crc kubenswrapper[4860]: I0320 10:54:46.902707 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:54:46 crc kubenswrapper[4860]: I0320 10:54:46.902719 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:54:47 crc kubenswrapper[4860]: I0320 10:54:47.367938 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 10:54:47 crc kubenswrapper[4860]: I0320 10:54:47.499536 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 10:54:47 crc kubenswrapper[4860]: I0320 10:54:47.500477 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:54:47 crc kubenswrapper[4860]: I0320 10:54:47.500510 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:54:47 crc kubenswrapper[4860]: I0320 10:54:47.500521 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:54:47 crc kubenswrapper[4860]: E0320 10:54:47.569708 4860 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 20 10:54:49 crc kubenswrapper[4860]: I0320 10:54:49.403029 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc"
Mar 20 10:54:49 crc kubenswrapper[4860]: I0320 10:54:49.404918 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 10:54:49 crc kubenswrapper[4860]: I0320 10:54:49.408286 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:54:49 crc kubenswrapper[4860]: I0320 10:54:49.408354 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:54:49 crc kubenswrapper[4860]: I0320 10:54:49.408373 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:54:49 crc kubenswrapper[4860]: I0320 10:54:49.901736 4860 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 20 10:54:49 crc kubenswrapper[4860]: I0320 10:54:49.901894 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 20 10:54:50 crc kubenswrapper[4860]: I0320 10:54:50.374269 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 10:54:50 crc kubenswrapper[4860]: I0320 10:54:50.374445 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 10:54:50 crc kubenswrapper[4860]: I0320 10:54:50.376011 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:54:50 crc kubenswrapper[4860]: I0320 10:54:50.376064 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:54:50 crc kubenswrapper[4860]: I0320 10:54:50.376077 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:54:52 crc kubenswrapper[4860]: I0320 10:54:52.033144 4860 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:52966->192.168.126.11:17697: read: connection reset by peer" start-of-body=
Mar 20 10:54:52 crc kubenswrapper[4860]: I0320 10:54:52.033296 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:52966->192.168.126.11:17697: read: connection reset by peer"
Mar 20 10:54:52 crc kubenswrapper[4860]: I0320 10:54:52.262018 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout
Mar 20 10:54:52 crc kubenswrapper[4860]: I0320 10:54:52.516207 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Mar 20 10:54:52 crc kubenswrapper[4860]: I0320 10:54:52.525826 4860 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c1e891984b7d45630352f0e37bf11ec8f9f8df49a17c27c868951abcc16e1a89" exitCode=255
Mar 20 10:54:52 crc kubenswrapper[4860]: I0320 10:54:52.525887 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"c1e891984b7d45630352f0e37bf11ec8f9f8df49a17c27c868951abcc16e1a89"}
Mar 20 10:54:52 crc kubenswrapper[4860]: I0320 10:54:52.526094 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 10:54:52 crc kubenswrapper[4860]: I0320 10:54:52.527245 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:54:52 crc kubenswrapper[4860]: I0320 10:54:52.527282 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:54:52 crc kubenswrapper[4860]: I0320 10:54:52.527294 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:54:52 crc kubenswrapper[4860]: I0320 10:54:52.527870 4860 scope.go:117] "RemoveContainer" containerID="c1e891984b7d45630352f0e37bf11ec8f9f8df49a17c27c868951abcc16e1a89"
Mar 20 10:54:53 crc kubenswrapper[4860]: E0320 10:54:53.116195 4860 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate
from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 20 10:54:53 crc kubenswrapper[4860]: W0320 10:54:53.214460 4860 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:54:53Z is after 2026-02-23T05:33:13Z Mar 20 10:54:53 crc kubenswrapper[4860]: E0320 10:54:53.214555 4860 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:54:53Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 10:54:53 crc kubenswrapper[4860]: W0320 10:54:53.215878 4860 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:54:53Z is after 2026-02-23T05:33:13Z Mar 20 10:54:53 crc kubenswrapper[4860]: E0320 10:54:53.215917 4860 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-20T10:54:53Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 10:54:53 crc kubenswrapper[4860]: E0320 10:54:53.216369 4860 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:54:53Z is after 2026-02-23T05:33:13Z" node="crc" Mar 20 10:54:53 crc kubenswrapper[4860]: E0320 10:54:53.219140 4860 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:54:53Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e874f2723de82 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:37.259382402 +0000 UTC m=+1.480743300,LastTimestamp:2026-03-20 10:54:37.259382402 +0000 UTC m=+1.480743300,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:54:53 crc kubenswrapper[4860]: E0320 10:54:53.221530 4860 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:54:53Z is after 2026-02-23T05:33:13Z" interval="6.4s" Mar 20 10:54:53 crc kubenswrapper[4860]: W0320 10:54:53.223577 4860 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get 
"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:54:53Z is after 2026-02-23T05:33:13Z Mar 20 10:54:53 crc kubenswrapper[4860]: E0320 10:54:53.223660 4860 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:54:53Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 10:54:53 crc kubenswrapper[4860]: I0320 10:54:53.223925 4860 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 20 10:54:53 crc kubenswrapper[4860]: I0320 10:54:53.224026 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 20 10:54:53 crc kubenswrapper[4860]: W0320 10:54:53.226751 4860 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:54:53Z is after 2026-02-23T05:33:13Z Mar 20 10:54:53 crc kubenswrapper[4860]: E0320 
10:54:53.226853 4860 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:54:53Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 10:54:53 crc kubenswrapper[4860]: I0320 10:54:53.229065 4860 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 20 10:54:53 crc kubenswrapper[4860]: I0320 10:54:53.229152 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 20 10:54:53 crc kubenswrapper[4860]: I0320 10:54:53.264510 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:54:53Z is after 2026-02-23T05:33:13Z Mar 20 10:54:53 crc kubenswrapper[4860]: I0320 10:54:53.531693 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 20 10:54:53 crc kubenswrapper[4860]: I0320 10:54:53.534193 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"79b2c88e091df7e4cc5a494a7746ce70e88e04622191ee0280af1de52f4a5ddd"} Mar 20 10:54:53 crc kubenswrapper[4860]: I0320 10:54:53.534417 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:53 crc kubenswrapper[4860]: I0320 10:54:53.535281 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:53 crc kubenswrapper[4860]: I0320 10:54:53.535321 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:53 crc kubenswrapper[4860]: I0320 10:54:53.535333 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:54 crc kubenswrapper[4860]: I0320 10:54:54.272878 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:54:54Z is after 2026-02-23T05:33:13Z Mar 20 10:54:54 crc kubenswrapper[4860]: I0320 10:54:54.373632 4860 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 20 10:54:54 crc kubenswrapper[4860]: [+]log ok Mar 20 10:54:54 crc kubenswrapper[4860]: [+]etcd ok Mar 20 10:54:54 crc kubenswrapper[4860]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 20 10:54:54 crc kubenswrapper[4860]: [+]poststarthook/openshift.io-api-request-count-filter ok Mar 20 10:54:54 crc kubenswrapper[4860]: [+]poststarthook/openshift.io-startkubeinformers ok Mar 20 10:54:54 crc kubenswrapper[4860]: 
[+]poststarthook/openshift.io-openshift-apiserver-reachable ok Mar 20 10:54:54 crc kubenswrapper[4860]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Mar 20 10:54:54 crc kubenswrapper[4860]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 20 10:54:54 crc kubenswrapper[4860]: [+]poststarthook/generic-apiserver-start-informers ok Mar 20 10:54:54 crc kubenswrapper[4860]: [+]poststarthook/priority-and-fairness-config-consumer ok Mar 20 10:54:54 crc kubenswrapper[4860]: [+]poststarthook/priority-and-fairness-filter ok Mar 20 10:54:54 crc kubenswrapper[4860]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 20 10:54:54 crc kubenswrapper[4860]: [+]poststarthook/start-apiextensions-informers ok Mar 20 10:54:54 crc kubenswrapper[4860]: [+]poststarthook/start-apiextensions-controllers ok Mar 20 10:54:54 crc kubenswrapper[4860]: [+]poststarthook/crd-informer-synced ok Mar 20 10:54:54 crc kubenswrapper[4860]: [+]poststarthook/start-system-namespaces-controller ok Mar 20 10:54:54 crc kubenswrapper[4860]: [+]poststarthook/start-cluster-authentication-info-controller ok Mar 20 10:54:54 crc kubenswrapper[4860]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Mar 20 10:54:54 crc kubenswrapper[4860]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Mar 20 10:54:54 crc kubenswrapper[4860]: [+]poststarthook/start-legacy-token-tracking-controller ok Mar 20 10:54:54 crc kubenswrapper[4860]: [+]poststarthook/start-service-ip-repair-controllers ok Mar 20 10:54:54 crc kubenswrapper[4860]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Mar 20 10:54:54 crc kubenswrapper[4860]: [+]poststarthook/scheduling/bootstrap-system-priority-classes ok Mar 20 10:54:54 crc kubenswrapper[4860]: [+]poststarthook/priority-and-fairness-config-producer ok Mar 20 10:54:54 crc kubenswrapper[4860]: [+]poststarthook/bootstrap-controller ok Mar 20 10:54:54 crc kubenswrapper[4860]: 
[+]poststarthook/aggregator-reload-proxy-client-cert ok Mar 20 10:54:54 crc kubenswrapper[4860]: [+]poststarthook/start-kube-aggregator-informers ok Mar 20 10:54:54 crc kubenswrapper[4860]: [+]poststarthook/apiservice-status-local-available-controller ok Mar 20 10:54:54 crc kubenswrapper[4860]: [+]poststarthook/apiservice-status-remote-available-controller ok Mar 20 10:54:54 crc kubenswrapper[4860]: [+]poststarthook/apiservice-registration-controller ok Mar 20 10:54:54 crc kubenswrapper[4860]: [+]poststarthook/apiservice-wait-for-first-sync ok Mar 20 10:54:54 crc kubenswrapper[4860]: [+]poststarthook/apiservice-discovery-controller ok Mar 20 10:54:54 crc kubenswrapper[4860]: [+]poststarthook/kube-apiserver-autoregistration ok Mar 20 10:54:54 crc kubenswrapper[4860]: [+]autoregister-completion ok Mar 20 10:54:54 crc kubenswrapper[4860]: [+]poststarthook/apiservice-openapi-controller ok Mar 20 10:54:54 crc kubenswrapper[4860]: [+]poststarthook/apiservice-openapiv3-controller ok Mar 20 10:54:54 crc kubenswrapper[4860]: livez check failed Mar 20 10:54:54 crc kubenswrapper[4860]: I0320 10:54:54.373707 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 10:54:54 crc kubenswrapper[4860]: I0320 10:54:54.539102 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 20 10:54:54 crc kubenswrapper[4860]: I0320 10:54:54.539543 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 20 10:54:54 crc kubenswrapper[4860]: I0320 10:54:54.541345 4860 generic.go:334] "Generic (PLEG): container finished" 
podID="f4b27818a5e8e43d0dc095d08835c792" containerID="79b2c88e091df7e4cc5a494a7746ce70e88e04622191ee0280af1de52f4a5ddd" exitCode=255 Mar 20 10:54:54 crc kubenswrapper[4860]: I0320 10:54:54.541403 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"79b2c88e091df7e4cc5a494a7746ce70e88e04622191ee0280af1de52f4a5ddd"} Mar 20 10:54:54 crc kubenswrapper[4860]: I0320 10:54:54.541447 4860 scope.go:117] "RemoveContainer" containerID="c1e891984b7d45630352f0e37bf11ec8f9f8df49a17c27c868951abcc16e1a89" Mar 20 10:54:54 crc kubenswrapper[4860]: I0320 10:54:54.541702 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:54 crc kubenswrapper[4860]: I0320 10:54:54.542901 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:54 crc kubenswrapper[4860]: I0320 10:54:54.542950 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:54 crc kubenswrapper[4860]: I0320 10:54:54.542969 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:54 crc kubenswrapper[4860]: I0320 10:54:54.544005 4860 scope.go:117] "RemoveContainer" containerID="79b2c88e091df7e4cc5a494a7746ce70e88e04622191ee0280af1de52f4a5ddd" Mar 20 10:54:54 crc kubenswrapper[4860]: E0320 10:54:54.544555 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 10:54:55 crc kubenswrapper[4860]: I0320 
10:54:55.263157 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:54:55Z is after 2026-02-23T05:33:13Z Mar 20 10:54:55 crc kubenswrapper[4860]: I0320 10:54:55.547025 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 20 10:54:55 crc kubenswrapper[4860]: I0320 10:54:55.674721 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 20 10:54:55 crc kubenswrapper[4860]: I0320 10:54:55.675017 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:55 crc kubenswrapper[4860]: I0320 10:54:55.676600 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:55 crc kubenswrapper[4860]: I0320 10:54:55.676639 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:55 crc kubenswrapper[4860]: I0320 10:54:55.676650 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:55 crc kubenswrapper[4860]: I0320 10:54:55.691696 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 20 10:54:56 crc kubenswrapper[4860]: I0320 10:54:56.270499 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:54:56Z is after 
2026-02-23T05:33:13Z Mar 20 10:54:56 crc kubenswrapper[4860]: I0320 10:54:56.555058 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:56 crc kubenswrapper[4860]: I0320 10:54:56.556667 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:56 crc kubenswrapper[4860]: I0320 10:54:56.556723 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:56 crc kubenswrapper[4860]: I0320 10:54:56.556741 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:57 crc kubenswrapper[4860]: I0320 10:54:57.264838 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:54:57Z is after 2026-02-23T05:33:13Z Mar 20 10:54:57 crc kubenswrapper[4860]: E0320 10:54:57.569965 4860 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 10:54:58 crc kubenswrapper[4860]: I0320 10:54:58.263964 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:54:58Z is after 2026-02-23T05:33:13Z Mar 20 10:54:59 crc kubenswrapper[4860]: I0320 10:54:59.263770 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T10:54:59Z is after 2026-02-23T05:33:13Z Mar 20 10:54:59 crc kubenswrapper[4860]: I0320 10:54:59.375579 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:54:59 crc kubenswrapper[4860]: I0320 10:54:59.375858 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:59 crc kubenswrapper[4860]: I0320 10:54:59.377420 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:59 crc kubenswrapper[4860]: I0320 10:54:59.377494 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:59 crc kubenswrapper[4860]: I0320 10:54:59.377507 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:59 crc kubenswrapper[4860]: I0320 10:54:59.378208 4860 scope.go:117] "RemoveContainer" containerID="79b2c88e091df7e4cc5a494a7746ce70e88e04622191ee0280af1de52f4a5ddd" Mar 20 10:54:59 crc kubenswrapper[4860]: E0320 10:54:59.378435 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 10:54:59 crc kubenswrapper[4860]: I0320 10:54:59.381330 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:54:59 crc kubenswrapper[4860]: I0320 10:54:59.562500 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:59 crc kubenswrapper[4860]: 
I0320 10:54:59.563987 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:59 crc kubenswrapper[4860]: I0320 10:54:59.564021 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:59 crc kubenswrapper[4860]: I0320 10:54:59.564029 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:59 crc kubenswrapper[4860]: I0320 10:54:59.564530 4860 scope.go:117] "RemoveContainer" containerID="79b2c88e091df7e4cc5a494a7746ce70e88e04622191ee0280af1de52f4a5ddd" Mar 20 10:54:59 crc kubenswrapper[4860]: E0320 10:54:59.564692 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 10:54:59 crc kubenswrapper[4860]: I0320 10:54:59.616748 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:59 crc kubenswrapper[4860]: I0320 10:54:59.618491 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:59 crc kubenswrapper[4860]: I0320 10:54:59.618525 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:59 crc kubenswrapper[4860]: I0320 10:54:59.618535 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:59 crc kubenswrapper[4860]: I0320 10:54:59.618562 4860 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 10:54:59 crc kubenswrapper[4860]: E0320 
10:54:59.621405 4860 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:54:59Z is after 2026-02-23T05:33:13Z" node="crc" Mar 20 10:54:59 crc kubenswrapper[4860]: E0320 10:54:59.625686 4860 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:54:59Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 20 10:54:59 crc kubenswrapper[4860]: I0320 10:54:59.902280 4860 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 10:54:59 crc kubenswrapper[4860]: I0320 10:54:59.902389 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 10:55:00 crc kubenswrapper[4860]: W0320 10:55:00.259153 4860 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:00Z is after 
2026-02-23T05:33:13Z
Mar 20 10:55:00 crc kubenswrapper[4860]: E0320 10:55:00.259282 4860 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:00Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 20 10:55:00 crc kubenswrapper[4860]: I0320 10:55:00.262191 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:00Z is after 2026-02-23T05:33:13Z
Mar 20 10:55:01 crc kubenswrapper[4860]: I0320 10:55:01.077588 4860 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 10:55:01 crc kubenswrapper[4860]: I0320 10:55:01.077812 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 10:55:01 crc kubenswrapper[4860]: I0320 10:55:01.079349 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:55:01 crc kubenswrapper[4860]: I0320 10:55:01.079409 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:55:01 crc kubenswrapper[4860]: I0320 10:55:01.079429 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:55:01 crc kubenswrapper[4860]: I0320 10:55:01.080424 4860 scope.go:117] "RemoveContainer" containerID="79b2c88e091df7e4cc5a494a7746ce70e88e04622191ee0280af1de52f4a5ddd"
Mar 20 10:55:01 crc kubenswrapper[4860]: E0320 10:55:01.080712 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 20 10:55:01 crc kubenswrapper[4860]: W0320 10:55:01.179308 4860 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:01Z is after 2026-02-23T05:33:13Z
Mar 20 10:55:01 crc kubenswrapper[4860]: E0320 10:55:01.179422 4860 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:01Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 20 10:55:01 crc kubenswrapper[4860]: I0320 10:55:01.262959 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:01Z is after 2026-02-23T05:33:13Z
Mar 20 10:55:01 crc kubenswrapper[4860]: I0320 10:55:01.566340 4860 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 20 10:55:01 crc kubenswrapper[4860]: E0320 10:55:01.570171 4860 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:01Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 20 10:55:01 crc kubenswrapper[4860]: W0320 10:55:01.576694 4860 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:01Z is after 2026-02-23T05:33:13Z
Mar 20 10:55:01 crc kubenswrapper[4860]: E0320 10:55:01.576829 4860 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:01Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 20 10:55:02 crc kubenswrapper[4860]: I0320 10:55:02.263430 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:02Z is after 2026-02-23T05:33:13Z
Mar 20 10:55:02 crc kubenswrapper[4860]: W0320 10:55:02.637475 4860 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:02Z is after 2026-02-23T05:33:13Z
Mar 20 10:55:02 crc kubenswrapper[4860]: E0320 10:55:02.637890 4860 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:02Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 20 10:55:02 crc kubenswrapper[4860]: I0320 10:55:02.639665 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 10:55:02 crc kubenswrapper[4860]: I0320 10:55:02.639894 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 10:55:02 crc kubenswrapper[4860]: I0320 10:55:02.641254 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:55:02 crc kubenswrapper[4860]: I0320 10:55:02.641352 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:55:02 crc kubenswrapper[4860]: I0320 10:55:02.641373 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:55:02 crc kubenswrapper[4860]: I0320 10:55:02.642395 4860 scope.go:117] "RemoveContainer" containerID="79b2c88e091df7e4cc5a494a7746ce70e88e04622191ee0280af1de52f4a5ddd"
Mar 20 10:55:02 crc kubenswrapper[4860]: E0320 10:55:02.642699 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 20 10:55:03 crc kubenswrapper[4860]: E0320 10:55:03.223737 4860 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:03Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e874f2723de82 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:37.259382402 +0000 UTC m=+1.480743300,LastTimestamp:2026-03-20 10:54:37.259382402 +0000 UTC m=+1.480743300,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 10:55:03 crc kubenswrapper[4860]: I0320 10:55:03.262993 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:03Z is after 2026-02-23T05:33:13Z
Mar 20 10:55:04 crc kubenswrapper[4860]: I0320 10:55:04.263413 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:04Z is after 2026-02-23T05:33:13Z
Mar 20 10:55:05 crc kubenswrapper[4860]: I0320 10:55:05.264914 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:05Z is after 2026-02-23T05:33:13Z
Mar 20 10:55:06 crc kubenswrapper[4860]: I0320 10:55:06.265253 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:06Z is after 2026-02-23T05:33:13Z
Mar 20 10:55:06 crc kubenswrapper[4860]: I0320 10:55:06.621519 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 10:55:06 crc kubenswrapper[4860]: I0320 10:55:06.623115 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:55:06 crc kubenswrapper[4860]: I0320 10:55:06.623180 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:55:06 crc kubenswrapper[4860]: I0320 10:55:06.623193 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:55:06 crc kubenswrapper[4860]: I0320 10:55:06.623307 4860 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 20 10:55:06 crc kubenswrapper[4860]: E0320 10:55:06.627098 4860 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:06Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 20 10:55:06 crc kubenswrapper[4860]: E0320 10:55:06.629689 4860 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:06Z is after 2026-02-23T05:33:13Z" interval="7s"
Mar 20 10:55:07 crc kubenswrapper[4860]: I0320 10:55:07.265294 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:07Z is after 2026-02-23T05:33:13Z
Mar 20 10:55:07 crc kubenswrapper[4860]: E0320 10:55:07.570097 4860 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 20 10:55:08 crc kubenswrapper[4860]: I0320 10:55:08.265220 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:08Z is after 2026-02-23T05:33:13Z
Mar 20 10:55:09 crc kubenswrapper[4860]: I0320 10:55:09.266783 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:09Z is after 2026-02-23T05:33:13Z
Mar 20 10:55:09 crc kubenswrapper[4860]: I0320 10:55:09.902609 4860 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 20 10:55:09 crc kubenswrapper[4860]: I0320 10:55:09.902756 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 20 10:55:09 crc kubenswrapper[4860]: I0320 10:55:09.902874 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 10:55:09 crc kubenswrapper[4860]: I0320 10:55:09.903140 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 10:55:09 crc kubenswrapper[4860]: I0320 10:55:09.904901 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:55:09 crc kubenswrapper[4860]: I0320 10:55:09.904971 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:55:09 crc kubenswrapper[4860]: I0320 10:55:09.904985 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:55:09 crc kubenswrapper[4860]: I0320 10:55:09.905622 4860 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"37468ccfc2574c3aefd9c88a616ce0f0c69cfa3a56b41c421d53461d4f8b2cc2"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted"
Mar 20 10:55:09 crc kubenswrapper[4860]: I0320 10:55:09.905809 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://37468ccfc2574c3aefd9c88a616ce0f0c69cfa3a56b41c421d53461d4f8b2cc2" gracePeriod=30
Mar 20 10:55:10 crc kubenswrapper[4860]: I0320 10:55:10.266501 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:10Z is after 2026-02-23T05:33:13Z
Mar 20 10:55:10 crc kubenswrapper[4860]: I0320 10:55:10.595510 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Mar 20 10:55:10 crc kubenswrapper[4860]: I0320 10:55:10.596039 4860 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="37468ccfc2574c3aefd9c88a616ce0f0c69cfa3a56b41c421d53461d4f8b2cc2" exitCode=255
Mar 20 10:55:10 crc kubenswrapper[4860]: I0320 10:55:10.596109 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"37468ccfc2574c3aefd9c88a616ce0f0c69cfa3a56b41c421d53461d4f8b2cc2"}
Mar 20 10:55:10 crc kubenswrapper[4860]: I0320 10:55:10.596166 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"82f4a0bcf048582263f9ba78f91872aba9de0ba2e3cce65a31e587c04a849fbc"}
Mar 20 10:55:10 crc kubenswrapper[4860]: I0320 10:55:10.596292 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 10:55:10 crc kubenswrapper[4860]: I0320 10:55:10.597638 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:55:10 crc kubenswrapper[4860]: I0320 10:55:10.597690 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:55:10 crc kubenswrapper[4860]: I0320 10:55:10.597704 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:55:11 crc kubenswrapper[4860]: I0320 10:55:11.263986 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:11Z is after 2026-02-23T05:33:13Z
Mar 20 10:55:12 crc kubenswrapper[4860]: I0320 10:55:12.263804 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:12Z is after 2026-02-23T05:33:13Z
Mar 20 10:55:13 crc kubenswrapper[4860]: E0320 10:55:13.228945 4860 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:13Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e874f2723de82 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:37.259382402 +0000 UTC m=+1.480743300,LastTimestamp:2026-03-20 10:54:37.259382402 +0000 UTC m=+1.480743300,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 10:55:13 crc kubenswrapper[4860]: I0320 10:55:13.265996 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:13Z is after 2026-02-23T05:33:13Z
Mar 20 10:55:13 crc kubenswrapper[4860]: I0320 10:55:13.627965 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 10:55:13 crc kubenswrapper[4860]: I0320 10:55:13.629762 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:55:13 crc kubenswrapper[4860]: I0320 10:55:13.629826 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:55:13 crc kubenswrapper[4860]: I0320 10:55:13.629846 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:55:13 crc kubenswrapper[4860]: I0320 10:55:13.629882 4860 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 20 10:55:13 crc kubenswrapper[4860]: E0320 10:55:13.632601 4860 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:13Z is after 2026-02-23T05:33:13Z" interval="7s"
Mar 20 10:55:13 crc kubenswrapper[4860]: E0320 10:55:13.632992 4860 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:13Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 20 10:55:14 crc kubenswrapper[4860]: I0320 10:55:14.262629 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:14Z is after 2026-02-23T05:33:13Z
Mar 20 10:55:15 crc kubenswrapper[4860]: I0320 10:55:15.263752 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:15Z is after 2026-02-23T05:33:13Z
Mar 20 10:55:16 crc kubenswrapper[4860]: I0320 10:55:16.262993 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:16Z is after 2026-02-23T05:33:13Z
Mar 20 10:55:16 crc kubenswrapper[4860]: I0320 10:55:16.901118 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 10:55:16 crc kubenswrapper[4860]: I0320 10:55:16.901320 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 10:55:16 crc kubenswrapper[4860]: I0320 10:55:16.905455 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:55:16 crc kubenswrapper[4860]: I0320 10:55:16.905501 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:55:16 crc kubenswrapper[4860]: I0320 10:55:16.905515 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:55:17 crc kubenswrapper[4860]: I0320 10:55:17.266055 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:17Z is after 2026-02-23T05:33:13Z
Mar 20 10:55:17 crc kubenswrapper[4860]: W0320 10:55:17.269347 4860 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:17Z is after 2026-02-23T05:33:13Z
Mar 20 10:55:17 crc kubenswrapper[4860]: E0320 10:55:17.269436 4860 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:17Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 20 10:55:17 crc kubenswrapper[4860]: W0320 10:55:17.305614 4860 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:17Z is after 2026-02-23T05:33:13Z
Mar 20 10:55:17 crc kubenswrapper[4860]: E0320 10:55:17.305715 4860 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:17Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 20 10:55:17 crc kubenswrapper[4860]: I0320 10:55:17.368558 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 10:55:17 crc kubenswrapper[4860]: I0320 10:55:17.413681 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 10:55:17 crc kubenswrapper[4860]: I0320 10:55:17.416116 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:55:17 crc kubenswrapper[4860]: I0320 10:55:17.416365 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:55:17 crc kubenswrapper[4860]: I0320 10:55:17.416479 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:55:17 crc kubenswrapper[4860]: I0320 10:55:17.417117 4860 scope.go:117] "RemoveContainer" containerID="79b2c88e091df7e4cc5a494a7746ce70e88e04622191ee0280af1de52f4a5ddd"
Mar 20 10:55:17 crc kubenswrapper[4860]: E0320 10:55:17.570369 4860 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 20 10:55:17 crc kubenswrapper[4860]: I0320 10:55:17.620312 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Mar 20 10:55:17 crc kubenswrapper[4860]: I0320 10:55:17.621886 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"cf65a1d8049a22874f20125f441990ba9f9e338d0f8b206706a27079a88b069e"}
Mar 20 10:55:17 crc kubenswrapper[4860]: I0320 10:55:17.621962 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 10:55:17 crc kubenswrapper[4860]: I0320 10:55:17.622196 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 10:55:17 crc kubenswrapper[4860]: I0320 10:55:17.623832 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:55:17 crc kubenswrapper[4860]: I0320 10:55:17.623882 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:55:17 crc kubenswrapper[4860]: I0320 10:55:17.623898 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:55:17 crc kubenswrapper[4860]: I0320 10:55:17.624664 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:55:17 crc kubenswrapper[4860]: I0320 10:55:17.624691 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:55:17 crc kubenswrapper[4860]: I0320 10:55:17.624703 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:55:17 crc kubenswrapper[4860]: W0320 10:55:17.641554 4860 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:17Z is after 2026-02-23T05:33:13Z
Mar 20 10:55:17 crc kubenswrapper[4860]: E0320 10:55:17.641666 4860 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:17Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 20 10:55:18 crc kubenswrapper[4860]: I0320 10:55:18.264881 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:18Z is after 2026-02-23T05:33:13Z
Mar 20 10:55:18 crc kubenswrapper[4860]: I0320 10:55:18.431422 4860 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 20 10:55:18 crc kubenswrapper[4860]: E0320 10:55:18.436085 4860 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:18Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 20 10:55:18 crc kubenswrapper[4860]: E0320 10:55:18.437384 4860 certificate_manager.go:440] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Reached backoff limit, still unable to rotate certs: timed out waiting for the condition" logger="UnhandledError"
Mar 20 10:55:18 crc kubenswrapper[4860]: I0320 10:55:18.626712 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 20 10:55:18 crc kubenswrapper[4860]: I0320 10:55:18.627470 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Mar 20 10:55:18 crc kubenswrapper[4860]: I0320 10:55:18.630500 4860 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="cf65a1d8049a22874f20125f441990ba9f9e338d0f8b206706a27079a88b069e" exitCode=255
Mar 20 10:55:18 crc kubenswrapper[4860]: I0320 10:55:18.630567 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"cf65a1d8049a22874f20125f441990ba9f9e338d0f8b206706a27079a88b069e"}
Mar 20 10:55:18 crc kubenswrapper[4860]: I0320 10:55:18.630632 4860 scope.go:117] "RemoveContainer" containerID="79b2c88e091df7e4cc5a494a7746ce70e88e04622191ee0280af1de52f4a5ddd"
Mar 20 10:55:18 crc kubenswrapper[4860]: I0320 10:55:18.630840 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 10:55:18 crc kubenswrapper[4860]: I0320 10:55:18.632034 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:55:18 crc kubenswrapper[4860]: I0320 10:55:18.632078 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:55:18 crc kubenswrapper[4860]: I0320 10:55:18.632092 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:55:18 crc kubenswrapper[4860]: I0320 10:55:18.632904 4860 scope.go:117] "RemoveContainer" containerID="cf65a1d8049a22874f20125f441990ba9f9e338d0f8b206706a27079a88b069e"
Mar 20 10:55:18 crc kubenswrapper[4860]: E0320 10:55:18.633143 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 20 10:55:19 crc kubenswrapper[4860]: I0320 10:55:19.264631 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:19Z is after 2026-02-23T05:33:13Z
Mar 20 10:55:19 crc kubenswrapper[4860]: I0320 10:55:19.636267 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 20 10:55:19 crc kubenswrapper[4860]: I0320 10:55:19.902030 4860 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 20 10:55:19 crc kubenswrapper[4860]: I0320 10:55:19.902129 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 20 10:55:20 crc kubenswrapper[4860]: I0320 10:55:20.264928 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:20Z is after 2026-02-23T05:33:13Z
Mar 20 10:55:20 crc kubenswrapper[4860]: I0320 10:55:20.633395 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 10:55:20 crc kubenswrapper[4860]: I0320 10:55:20.635219 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:55:20 crc kubenswrapper[4860]: I0320 10:55:20.635288 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:55:20 crc kubenswrapper[4860]: I0320 10:55:20.635301 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:55:20 crc kubenswrapper[4860]: I0320 10:55:20.635335 4860 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 20 10:55:20 crc kubenswrapper[4860]: E0320 10:55:20.636426 4860 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:20Z is after 2026-02-23T05:33:13Z" interval="7s"
Mar 20 10:55:20 crc kubenswrapper[4860]: E0320 10:55:20.637937 4860 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:20Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 20 10:55:21 crc kubenswrapper[4860]: I0320 10:55:21.077628 4860 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 10:55:21 crc kubenswrapper[4860]: I0320 10:55:21.077823 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 10:55:21 crc kubenswrapper[4860]: I0320 10:55:21.079117 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:55:21 crc kubenswrapper[4860]: I0320 10:55:21.079150 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:55:21 crc kubenswrapper[4860]: I0320 10:55:21.079161 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:55:21 crc kubenswrapper[4860]: I0320 10:55:21.079841 4860 scope.go:117] "RemoveContainer" containerID="cf65a1d8049a22874f20125f441990ba9f9e338d0f8b206706a27079a88b069e"
Mar 20 10:55:21 crc kubenswrapper[4860]: E0320 10:55:21.080050 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 20 10:55:21 crc kubenswrapper[4860]: I0320 10:55:21.263464 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:21Z is after 2026-02-23T05:33:13Z
Mar 20 10:55:22 crc kubenswrapper[4860]: I0320 10:55:22.262693 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:22Z is after 2026-02-23T05:33:13Z
Mar 20 10:55:22 crc kubenswrapper[4860]: I0320 10:55:22.639700 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 10:55:22 crc kubenswrapper[4860]: I0320 10:55:22.639949 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 10:55:22 crc kubenswrapper[4860]: I0320 10:55:22.641605 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:55:22 crc kubenswrapper[4860]: I0320 10:55:22.641638 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:55:22 crc kubenswrapper[4860]: I0320 10:55:22.641651 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:55:22 crc kubenswrapper[4860]: I0320 10:55:22.642405 4860 scope.go:117] "RemoveContainer" 
containerID="cf65a1d8049a22874f20125f441990ba9f9e338d0f8b206706a27079a88b069e" Mar 20 10:55:22 crc kubenswrapper[4860]: E0320 10:55:22.642621 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 10:55:23 crc kubenswrapper[4860]: E0320 10:55:23.232818 4860 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:23Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e874f2723de82 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:37.259382402 +0000 UTC m=+1.480743300,LastTimestamp:2026-03-20 10:54:37.259382402 +0000 UTC m=+1.480743300,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:23 crc kubenswrapper[4860]: I0320 10:55:23.263131 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:23Z is after 2026-02-23T05:33:13Z Mar 20 10:55:24 crc kubenswrapper[4860]: I0320 10:55:24.265540 4860 csi_plugin.go:884] Failed to 
contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:24Z is after 2026-02-23T05:33:13Z Mar 20 10:55:25 crc kubenswrapper[4860]: I0320 10:55:25.262874 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:25Z is after 2026-02-23T05:33:13Z Mar 20 10:55:26 crc kubenswrapper[4860]: I0320 10:55:26.263441 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:26Z is after 2026-02-23T05:33:13Z Mar 20 10:55:26 crc kubenswrapper[4860]: W0320 10:55:26.695990 4860 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:26Z is after 2026-02-23T05:33:13Z Mar 20 10:55:26 crc kubenswrapper[4860]: E0320 10:55:26.696090 4860 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:26Z is after 2026-02-23T05:33:13Z" 
logger="UnhandledError" Mar 20 10:55:27 crc kubenswrapper[4860]: I0320 10:55:27.264922 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:27Z is after 2026-02-23T05:33:13Z Mar 20 10:55:27 crc kubenswrapper[4860]: E0320 10:55:27.570609 4860 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 10:55:27 crc kubenswrapper[4860]: I0320 10:55:27.638426 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:27 crc kubenswrapper[4860]: I0320 10:55:27.640004 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:27 crc kubenswrapper[4860]: I0320 10:55:27.640051 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:27 crc kubenswrapper[4860]: I0320 10:55:27.640065 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:27 crc kubenswrapper[4860]: I0320 10:55:27.640096 4860 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 10:55:27 crc kubenswrapper[4860]: E0320 10:55:27.643491 4860 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:27Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 20 10:55:27 crc kubenswrapper[4860]: E0320 10:55:27.643743 4860 kubelet_node_status.go:99] "Unable to register node with API server" 
err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:27Z is after 2026-02-23T05:33:13Z" node="crc" Mar 20 10:55:28 crc kubenswrapper[4860]: I0320 10:55:28.264430 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:28Z is after 2026-02-23T05:33:13Z Mar 20 10:55:29 crc kubenswrapper[4860]: I0320 10:55:29.264727 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:29Z is after 2026-02-23T05:33:13Z Mar 20 10:55:29 crc kubenswrapper[4860]: I0320 10:55:29.902261 4860 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 10:55:29 crc kubenswrapper[4860]: I0320 10:55:29.902338 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 10:55:30 crc kubenswrapper[4860]: I0320 10:55:30.264866 4860 csi_plugin.go:884] Failed to contact API 
server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:30Z is after 2026-02-23T05:33:13Z Mar 20 10:55:31 crc kubenswrapper[4860]: I0320 10:55:31.263205 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:31Z is after 2026-02-23T05:33:13Z Mar 20 10:55:32 crc kubenswrapper[4860]: I0320 10:55:32.262524 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:32Z is after 2026-02-23T05:33:13Z Mar 20 10:55:33 crc kubenswrapper[4860]: E0320 10:55:33.237408 4860 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:33Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e874f2723de82 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:37.259382402 +0000 UTC m=+1.480743300,LastTimestamp:2026-03-20 10:54:37.259382402 +0000 UTC m=+1.480743300,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:33 crc kubenswrapper[4860]: I0320 10:55:33.262960 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:33Z is after 2026-02-23T05:33:13Z Mar 20 10:55:34 crc kubenswrapper[4860]: I0320 10:55:34.083003 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 10:55:34 crc kubenswrapper[4860]: I0320 10:55:34.083290 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:34 crc kubenswrapper[4860]: I0320 10:55:34.084609 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:34 crc kubenswrapper[4860]: I0320 10:55:34.084682 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:34 crc kubenswrapper[4860]: I0320 10:55:34.084700 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:34 crc kubenswrapper[4860]: I0320 10:55:34.262947 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:34Z is after 2026-02-23T05:33:13Z Mar 20 10:55:34 crc kubenswrapper[4860]: I0320 10:55:34.644128 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:34 crc kubenswrapper[4860]: I0320 10:55:34.645987 4860 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:34 crc kubenswrapper[4860]: I0320 10:55:34.646089 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:34 crc kubenswrapper[4860]: I0320 10:55:34.646110 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:34 crc kubenswrapper[4860]: I0320 10:55:34.646160 4860 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 10:55:34 crc kubenswrapper[4860]: E0320 10:55:34.649758 4860 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:34Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 20 10:55:34 crc kubenswrapper[4860]: E0320 10:55:34.651285 4860 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:34Z is after 2026-02-23T05:33:13Z" node="crc" Mar 20 10:55:35 crc kubenswrapper[4860]: I0320 10:55:35.263390 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:35Z is after 2026-02-23T05:33:13Z Mar 20 10:55:36 crc kubenswrapper[4860]: I0320 10:55:36.264876 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:36Z is after 2026-02-23T05:33:13Z Mar 20 10:55:37 crc kubenswrapper[4860]: I0320 10:55:37.263269 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:37Z is after 2026-02-23T05:33:13Z Mar 20 10:55:37 crc kubenswrapper[4860]: I0320 10:55:37.413268 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:37 crc kubenswrapper[4860]: I0320 10:55:37.418663 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:37 crc kubenswrapper[4860]: I0320 10:55:37.418707 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:37 crc kubenswrapper[4860]: I0320 10:55:37.418719 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:37 crc kubenswrapper[4860]: I0320 10:55:37.419477 4860 scope.go:117] "RemoveContainer" containerID="cf65a1d8049a22874f20125f441990ba9f9e338d0f8b206706a27079a88b069e" Mar 20 10:55:37 crc kubenswrapper[4860]: E0320 10:55:37.419679 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 10:55:37 crc 
kubenswrapper[4860]: E0320 10:55:37.571037 4860 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 10:55:38 crc kubenswrapper[4860]: I0320 10:55:38.268695 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:55:39 crc kubenswrapper[4860]: I0320 10:55:39.264815 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:55:39 crc kubenswrapper[4860]: I0320 10:55:39.902697 4860 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 10:55:39 crc kubenswrapper[4860]: I0320 10:55:39.903160 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 10:55:39 crc kubenswrapper[4860]: I0320 10:55:39.903297 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:55:39 crc kubenswrapper[4860]: I0320 10:55:39.903478 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume 
controller attach/detach" Mar 20 10:55:39 crc kubenswrapper[4860]: I0320 10:55:39.905041 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:39 crc kubenswrapper[4860]: I0320 10:55:39.905462 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:39 crc kubenswrapper[4860]: I0320 10:55:39.905637 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:39 crc kubenswrapper[4860]: I0320 10:55:39.906525 4860 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"82f4a0bcf048582263f9ba78f91872aba9de0ba2e3cce65a31e587c04a849fbc"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 20 10:55:39 crc kubenswrapper[4860]: I0320 10:55:39.906841 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://82f4a0bcf048582263f9ba78f91872aba9de0ba2e3cce65a31e587c04a849fbc" gracePeriod=30 Mar 20 10:55:40 crc kubenswrapper[4860]: I0320 10:55:40.264102 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:55:40 crc kubenswrapper[4860]: I0320 10:55:40.699091 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 20 10:55:40 crc kubenswrapper[4860]: I0320 10:55:40.701030 
4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 20 10:55:40 crc kubenswrapper[4860]: I0320 10:55:40.701495 4860 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="82f4a0bcf048582263f9ba78f91872aba9de0ba2e3cce65a31e587c04a849fbc" exitCode=255 Mar 20 10:55:40 crc kubenswrapper[4860]: I0320 10:55:40.701636 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"82f4a0bcf048582263f9ba78f91872aba9de0ba2e3cce65a31e587c04a849fbc"} Mar 20 10:55:40 crc kubenswrapper[4860]: I0320 10:55:40.701739 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"cf2624c09c9ed0c88340cb5c33a9f304b84e7f10b768178bc03980768edd770b"} Mar 20 10:55:40 crc kubenswrapper[4860]: I0320 10:55:40.701832 4860 scope.go:117] "RemoveContainer" containerID="37468ccfc2574c3aefd9c88a616ce0f0c69cfa3a56b41c421d53461d4f8b2cc2" Mar 20 10:55:40 crc kubenswrapper[4860]: I0320 10:55:40.702075 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:40 crc kubenswrapper[4860]: I0320 10:55:40.703387 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:40 crc kubenswrapper[4860]: I0320 10:55:40.703522 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:40 crc kubenswrapper[4860]: I0320 10:55:40.703608 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:41 crc 
kubenswrapper[4860]: I0320 10:55:41.267736 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:55:41 crc kubenswrapper[4860]: I0320 10:55:41.652263 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:41 crc kubenswrapper[4860]: I0320 10:55:41.654216 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:41 crc kubenswrapper[4860]: I0320 10:55:41.654347 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:41 crc kubenswrapper[4860]: I0320 10:55:41.654364 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:41 crc kubenswrapper[4860]: I0320 10:55:41.654402 4860 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 10:55:41 crc kubenswrapper[4860]: E0320 10:55:41.658054 4860 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 10:55:41 crc kubenswrapper[4860]: E0320 10:55:41.658175 4860 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 10:55:41 crc kubenswrapper[4860]: I0320 10:55:41.706862 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 20 10:55:42 crc kubenswrapper[4860]: 
I0320 10:55:42.265714 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.244827 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e874f2723de82 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:37.259382402 +0000 UTC m=+1.480743300,LastTimestamp:2026-03-20 10:54:37.259382402 +0000 UTC m=+1.480743300,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.248929 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e874f2c7541d0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:37.34860232 +0000 UTC m=+1.569963238,LastTimestamp:2026-03-20 10:54:37.34860232 +0000 UTC m=+1.569963238,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.254626 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e874f2c75c917 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:37.348636951 +0000 UTC m=+1.569997869,LastTimestamp:2026-03-20 10:54:37.348636951 +0000 UTC m=+1.569997869,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.259140 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e874f2c76057c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:37.348652412 +0000 UTC m=+1.570013320,LastTimestamp:2026-03-20 10:54:37.348652412 +0000 UTC m=+1.570013320,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.263939 4860 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e874f39095a90 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:37.559634576 +0000 UTC m=+1.780995504,LastTimestamp:2026-03-20 10:54:37.559634576 +0000 UTC m=+1.780995504,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: I0320 10:55:43.264076 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.265703 4860 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e874f2c7541d0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e874f2c7541d0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:37.34860232 +0000 UTC m=+1.569963238,LastTimestamp:2026-03-20 10:54:37.661779981 +0000 UTC m=+1.883140879,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 
10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.270844 4860 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e874f2c75c917\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e874f2c75c917 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:37.348636951 +0000 UTC m=+1.569997869,LastTimestamp:2026-03-20 10:54:37.661804402 +0000 UTC m=+1.883165300,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.275770 4860 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e874f2c76057c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e874f2c76057c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:37.348652412 +0000 UTC m=+1.570013320,LastTimestamp:2026-03-20 10:54:37.661815242 +0000 UTC m=+1.883176140,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.282658 4860 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e874f2c7541d0\" is forbidden: 
User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e874f2c7541d0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:37.34860232 +0000 UTC m=+1.569963238,LastTimestamp:2026-03-20 10:54:37.714601264 +0000 UTC m=+1.935962172,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.288347 4860 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e874f2c75c917\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e874f2c75c917 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:37.348636951 +0000 UTC m=+1.569997869,LastTimestamp:2026-03-20 10:54:37.714630825 +0000 UTC m=+1.935991713,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.291427 4860 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e874f2c76057c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e874f2c76057c default 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:37.348652412 +0000 UTC m=+1.570013320,LastTimestamp:2026-03-20 10:54:37.714640035 +0000 UTC m=+1.936000933,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.295963 4860 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e874f2c7541d0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e874f2c7541d0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:37.34860232 +0000 UTC m=+1.569963238,LastTimestamp:2026-03-20 10:54:37.716196341 +0000 UTC m=+1.937557239,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.300335 4860 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e874f2c75c917\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e874f2c75c917 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:37.348636951 +0000 UTC m=+1.569997869,LastTimestamp:2026-03-20 10:54:37.716212851 +0000 UTC m=+1.937573749,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.304811 4860 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e874f2c76057c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e874f2c76057c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:37.348652412 +0000 UTC m=+1.570013320,LastTimestamp:2026-03-20 10:54:37.716237651 +0000 UTC m=+1.937598549,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.310193 4860 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e874f2c7541d0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e874f2c7541d0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status 
is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:37.34860232 +0000 UTC m=+1.569963238,LastTimestamp:2026-03-20 10:54:37.717176103 +0000 UTC m=+1.938536991,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.314788 4860 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e874f2c75c917\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e874f2c75c917 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:37.348636951 +0000 UTC m=+1.569997869,LastTimestamp:2026-03-20 10:54:37.717197144 +0000 UTC m=+1.938558042,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.318999 4860 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e874f2c76057c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e874f2c76057c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:37.348652412 +0000 UTC 
m=+1.570013320,LastTimestamp:2026-03-20 10:54:37.717206964 +0000 UTC m=+1.938567862,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.323880 4860 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e874f2c7541d0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e874f2c7541d0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:37.34860232 +0000 UTC m=+1.569963238,LastTimestamp:2026-03-20 10:54:37.717638014 +0000 UTC m=+1.938998912,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.329398 4860 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e874f2c75c917\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e874f2c75c917 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:37.348636951 +0000 UTC m=+1.569997869,LastTimestamp:2026-03-20 10:54:37.717660774 +0000 UTC m=+1.939021672,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.334099 4860 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e874f2c76057c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e874f2c76057c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:37.348652412 +0000 UTC m=+1.570013320,LastTimestamp:2026-03-20 10:54:37.717669384 +0000 UTC m=+1.939030282,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.340165 4860 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e874f2c7541d0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e874f2c7541d0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:37.34860232 +0000 UTC m=+1.569963238,LastTimestamp:2026-03-20 10:54:37.717732256 +0000 UTC m=+1.939093154,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.344915 4860 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e874f2c75c917\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e874f2c75c917 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:37.348636951 +0000 UTC m=+1.569997869,LastTimestamp:2026-03-20 10:54:37.717750926 +0000 UTC m=+1.939111824,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.350118 4860 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e874f2c76057c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e874f2c76057c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:37.348652412 +0000 UTC m=+1.570013320,LastTimestamp:2026-03-20 10:54:37.717769097 +0000 UTC m=+1.939129995,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.354118 4860 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e874f2c7541d0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in 
API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e874f2c7541d0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:37.34860232 +0000 UTC m=+1.569963238,LastTimestamp:2026-03-20 10:54:37.718692558 +0000 UTC m=+1.940053456,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.358405 4860 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e874f2c75c917\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e874f2c75c917 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:37.348636951 +0000 UTC m=+1.569997869,LastTimestamp:2026-03-20 10:54:37.718710698 +0000 UTC m=+1.940071596,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.364002 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e874f60ce5e36 openshift-machine-config-operator 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:38.226857526 +0000 UTC m=+2.448218424,LastTimestamp:2026-03-20 10:54:38.226857526 +0000 UTC m=+2.448218424,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.368406 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e874f60cfff40 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:38.226964288 +0000 UTC m=+2.448325216,LastTimestamp:2026-03-20 10:54:38.226964288 +0000 UTC m=+2.448325216,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.371609 4860 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e874f60fc9af8 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:38.229887736 +0000 UTC m=+2.451248634,LastTimestamp:2026-03-20 10:54:38.229887736 +0000 UTC m=+2.451248634,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.375086 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e874f613b23e4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:38.23398602 +0000 UTC m=+2.455346948,LastTimestamp:2026-03-20 10:54:38.23398602 +0000 UTC m=+2.455346948,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.378522 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e874f6148876e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:38.23486347 +0000 UTC m=+2.456224408,LastTimestamp:2026-03-20 10:54:38.23486347 +0000 UTC m=+2.456224408,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.382659 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e874fd63a618b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.196870539 +0000 UTC m=+4.418231447,LastTimestamp:2026-03-20 10:54:40.196870539 +0000 UTC m=+4.418231447,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.386386 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e874fd6a9cfd3 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.204173267 +0000 UTC m=+4.425534215,LastTimestamp:2026-03-20 10:54:40.204173267 +0000 UTC m=+4.425534215,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.389870 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" 
event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e874fd6d960c9 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.207290569 +0000 UTC m=+4.428651477,LastTimestamp:2026-03-20 10:54:40.207290569 +0000 UTC m=+4.428651477,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.394300 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e874fd6f64732 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.209184562 +0000 UTC m=+4.430545480,LastTimestamp:2026-03-20 10:54:40.209184562 +0000 UTC m=+4.430545480,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.397985 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e874fd70c7ca3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.210640035 +0000 UTC m=+4.432000943,LastTimestamp:2026-03-20 10:54:40.210640035 +0000 UTC m=+4.432000943,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.402152 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e874fd7233740 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.2121296 +0000 UTC m=+4.433490508,LastTimestamp:2026-03-20 10:54:40.2121296 +0000 UTC m=+4.433490508,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.405502 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API 
group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e874fd73d7a7d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.213850749 +0000 UTC m=+4.435211657,LastTimestamp:2026-03-20 10:54:40.213850749 +0000 UTC m=+4.435211657,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.408880 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e874fd798b6f6 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.219830006 +0000 UTC m=+4.441190914,LastTimestamp:2026-03-20 10:54:40.219830006 +0000 UTC m=+4.441190914,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.412108 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e874fd7b5e46e openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.22174219 +0000 UTC m=+4.443103098,LastTimestamp:2026-03-20 10:54:40.22174219 +0000 UTC m=+4.443103098,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.415342 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e874fd854404f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.232120399 +0000 UTC m=+4.453481337,LastTimestamp:2026-03-20 10:54:40.232120399 +0000 UTC m=+4.453481337,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.419057 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e874fd88077dd openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.235018205 +0000 UTC m=+4.456379113,LastTimestamp:2026-03-20 10:54:40.235018205 +0000 UTC m=+4.456379113,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.424033 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e874fe4a4f59b openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 
10:54:40.438736283 +0000 UTC m=+4.660097191,LastTimestamp:2026-03-20 10:54:40.438736283 +0000 UTC m=+4.660097191,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.427584 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e874fe4b9be96 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.440098454 +0000 UTC m=+4.661459352,LastTimestamp:2026-03-20 10:54:40.440098454 +0000 UTC m=+4.661459352,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.431288 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e874fe4d9cd2e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.442199342 +0000 UTC m=+4.663560250,LastTimestamp:2026-03-20 10:54:40.442199342 +0000 UTC m=+4.663560250,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.435312 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e874fe4ea8f79 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.443297657 +0000 UTC m=+4.664658595,LastTimestamp:2026-03-20 10:54:40.443297657 +0000 UTC m=+4.664658595,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.440617 4860 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e874fec61e627 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.568559143 +0000 UTC m=+4.789920041,LastTimestamp:2026-03-20 10:54:40.568559143 +0000 UTC m=+4.789920041,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.445214 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e874fed485a3b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.583662139 +0000 UTC m=+4.805023037,LastTimestamp:2026-03-20 10:54:40.583662139 +0000 UTC m=+4.805023037,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 
+0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.450299 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e874fed587aa6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.584719014 +0000 UTC m=+4.806079912,LastTimestamp:2026-03-20 10:54:40.584719014 +0000 UTC m=+4.806079912,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.455592 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e874ff0ef6b36 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container 
kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.644942646 +0000 UTC m=+4.866303544,LastTimestamp:2026-03-20 10:54:40.644942646 +0000 UTC m=+4.866303544,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.460360 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e874ff0ffb71b openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.646010651 +0000 UTC m=+4.867371569,LastTimestamp:2026-03-20 10:54:40.646010651 +0000 UTC m=+4.867371569,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.465114 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e874ff106652f openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.646448431 +0000 UTC m=+4.867809329,LastTimestamp:2026-03-20 10:54:40.646448431 +0000 UTC m=+4.867809329,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.469955 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e874ff1069eff openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.646463231 +0000 UTC m=+4.867824119,LastTimestamp:2026-03-20 10:54:40.646463231 +0000 UTC m=+4.867824119,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.474614 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e874ff1ab3521 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] 
[] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.657249569 +0000 UTC m=+4.878610477,LastTimestamp:2026-03-20 10:54:40.657249569 +0000 UTC m=+4.878610477,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.480302 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e874ff1ba76d8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.658249432 +0000 UTC m=+4.879610340,LastTimestamp:2026-03-20 10:54:40.658249432 +0000 UTC m=+4.879610340,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.485544 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API 
group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e874ff213a284 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.664093316 +0000 UTC m=+4.885454224,LastTimestamp:2026-03-20 10:54:40.664093316 +0000 UTC m=+4.885454224,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.490552 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e874ff24616b6 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.667399862 +0000 UTC m=+4.888760760,LastTimestamp:2026-03-20 10:54:40.667399862 +0000 UTC m=+4.888760760,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 
20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.495177 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e874ff2a3a8f9 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.673532153 +0000 UTC m=+4.894893051,LastTimestamp:2026-03-20 10:54:40.673532153 +0000 UTC m=+4.894893051,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.496749 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e874ff2a3bd21 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.673537313 +0000 UTC m=+4.894898211,LastTimestamp:2026-03-20 10:54:40.673537313 +0000 UTC m=+4.894898211,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.501318 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e874ff83f1d2f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.767606063 +0000 UTC m=+4.988966961,LastTimestamp:2026-03-20 10:54:40.767606063 +0000 UTC m=+4.988966961,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.506830 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e874ff934dcab openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container 
kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.783711403 +0000 UTC m=+5.005072301,LastTimestamp:2026-03-20 10:54:40.783711403 +0000 UTC m=+5.005072301,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.513124 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e874ff94892e0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.785003232 +0000 UTC m=+5.006364130,LastTimestamp:2026-03-20 10:54:40.785003232 +0000 UTC m=+5.006364130,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.518580 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e874ffc11fd8a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 
UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.831757706 +0000 UTC m=+5.053118604,LastTimestamp:2026-03-20 10:54:40.831757706 +0000 UTC m=+5.053118604,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.523487 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e874ffc9b0a72 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.840739442 +0000 UTC m=+5.062100340,LastTimestamp:2026-03-20 10:54:40.840739442 +0000 UTC m=+5.062100340,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.530648 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.189e874ffce68c94 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.845687956 +0000 UTC m=+5.067048854,LastTimestamp:2026-03-20 10:54:40.845687956 +0000 UTC m=+5.067048854,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.535707 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e874ffd019412 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.847459346 +0000 UTC m=+5.068820244,LastTimestamp:2026-03-20 10:54:40.847459346 +0000 UTC m=+5.068820244,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 
10:55:43.540269 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e874ffd58c5b8 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.853173688 +0000 UTC m=+5.074534586,LastTimestamp:2026-03-20 10:54:40.853173688 +0000 UTC m=+5.074534586,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.544579 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e874ffd726746 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.854853446 +0000 UTC 
m=+5.076214344,LastTimestamp:2026-03-20 10:54:40.854853446 +0000 UTC m=+5.076214344,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.549046 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e875003aa3071 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.959172721 +0000 UTC m=+5.180533619,LastTimestamp:2026-03-20 10:54:40.959172721 +0000 UTC m=+5.180533619,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.553497 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e875004a41d7c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.975551868 +0000 UTC m=+5.196912766,LastTimestamp:2026-03-20 10:54:40.975551868 +0000 UTC m=+5.196912766,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.558013 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e87500692d6c2 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:41.007974082 +0000 UTC m=+5.229334980,LastTimestamp:2026-03-20 10:54:41.007974082 +0000 UTC m=+5.229334980,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.562253 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e875006a56e84 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:41.00919258 +0000 UTC m=+5.230553478,LastTimestamp:2026-03-20 10:54:41.00919258 +0000 UTC m=+5.230553478,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.567318 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e875007d5e14c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:41.029144908 +0000 UTC m=+5.250505806,LastTimestamp:2026-03-20 10:54:41.029144908 +0000 UTC m=+5.250505806,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.571446 4860 event.go:359] "Server rejected 
event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e875007ed79fc openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:41.030691324 +0000 UTC m=+5.252052222,LastTimestamp:2026-03-20 10:54:41.030691324 +0000 UTC m=+5.252052222,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.575639 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8750080b3a90 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:41.032641168 +0000 UTC m=+5.254002066,LastTimestamp:2026-03-20 10:54:41.032641168 +0000 UTC 
m=+5.254002066,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.580742 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8750122e2f0f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:41.202704143 +0000 UTC m=+5.424065041,LastTimestamp:2026-03-20 10:54:41.202704143 +0000 UTC m=+5.424065041,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.585106 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e875012f0e138 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 
10:54:41.215463736 +0000 UTC m=+5.436824634,LastTimestamp:2026-03-20 10:54:41.215463736 +0000 UTC m=+5.436824634,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.589475 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8750130872d6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:41.217008342 +0000 UTC m=+5.438369240,LastTimestamp:2026-03-20 10:54:41.217008342 +0000 UTC m=+5.438369240,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.593526 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e87501e35ce1f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:41.404530207 +0000 UTC m=+5.625891105,LastTimestamp:2026-03-20 10:54:41.404530207 +0000 UTC m=+5.625891105,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.598543 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e87501ee0c999 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:41.415735705 +0000 UTC m=+5.637096603,LastTimestamp:2026-03-20 10:54:41.415735705 +0000 UTC m=+5.637096603,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.601538 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8750218c4bad 
openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:41.460530093 +0000 UTC m=+5.681890991,LastTimestamp:2026-03-20 10:54:41.460530093 +0000 UTC m=+5.681890991,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.603732 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e87502c96fe07 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:41.645780487 +0000 UTC m=+5.867141385,LastTimestamp:2026-03-20 10:54:41.645780487 +0000 UTC m=+5.867141385,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.606921 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in 
the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e87502df3733d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:41.668617021 +0000 UTC m=+5.889977919,LastTimestamp:2026-03-20 10:54:41.668617021 +0000 UTC m=+5.889977919,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.611618 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e87505e4bb584 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:42.479707524 +0000 UTC m=+6.701068432,LastTimestamp:2026-03-20 10:54:42.479707524 +0000 UTC m=+6.701068432,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.615891 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8750696709a9 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:42.666047913 +0000 UTC m=+6.887408811,LastTimestamp:2026-03-20 10:54:42.666047913 +0000 UTC m=+6.887408811,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.619782 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e875069ff3ab6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:42.676021942 +0000 UTC m=+6.897382850,LastTimestamp:2026-03-20 10:54:42.676021942 +0000 UTC m=+6.897382850,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.624101 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e87506a124406 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:42.67726951 +0000 UTC m=+6.898630408,LastTimestamp:2026-03-20 10:54:42.67726951 +0000 UTC m=+6.898630408,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.628180 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e87507696517e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:42.887250302 +0000 UTC m=+7.108611210,LastTimestamp:2026-03-20 10:54:42.887250302 +0000 UTC m=+7.108611210,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.632065 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API 
group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8750774fb72b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:42.899400491 +0000 UTC m=+7.120761429,LastTimestamp:2026-03-20 10:54:42.899400491 +0000 UTC m=+7.120761429,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.638251 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8750776304ba openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:42.90066553 +0000 UTC m=+7.122026438,LastTimestamp:2026-03-20 10:54:42.90066553 +0000 UTC m=+7.122026438,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.643057 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" 
cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e875082492402 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:43.083518978 +0000 UTC m=+7.304879866,LastTimestamp:2026-03-20 10:54:43.083518978 +0000 UTC m=+7.304879866,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.647029 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e875083347dac openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:43.098942892 +0000 UTC m=+7.320303790,LastTimestamp:2026-03-20 10:54:43.098942892 +0000 UTC m=+7.320303790,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.651245 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8750834705a6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:43.10015735 +0000 UTC m=+7.321518248,LastTimestamp:2026-03-20 10:54:43.10015735 +0000 UTC m=+7.321518248,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.655127 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e875092656d56 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:43.353808214 +0000 UTC m=+7.575169142,LastTimestamp:2026-03-20 10:54:43.353808214 +0000 UTC m=+7.575169142,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.658442 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create 
resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e87509329a424 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:43.3666673 +0000 UTC m=+7.588028208,LastTimestamp:2026-03-20 10:54:43.3666673 +0000 UTC m=+7.588028208,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.663174 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e87509345a8a6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:43.368503462 +0000 UTC m=+7.589864360,LastTimestamp:2026-03-20 10:54:43.368503462 +0000 UTC m=+7.589864360,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.667897 4860 event.go:359] "Server rejected event (will not retry!)" err="events is 
forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e87509d92a816 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:43.54132175 +0000 UTC m=+7.762682648,LastTimestamp:2026-03-20 10:54:43.54132175 +0000 UTC m=+7.762682648,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.672029 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e87509e3ea95d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:43.552594269 +0000 UTC m=+7.773955167,LastTimestamp:2026-03-20 10:54:43.552594269 +0000 UTC m=+7.773955167,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.678811 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the 
namespace \"openshift-kube-controller-manager\"" event=< Mar 20 10:55:43 crc kubenswrapper[4860]: &Event{ObjectMeta:{kube-controller-manager-crc.189e875218b0595c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 20 10:55:43 crc kubenswrapper[4860]: body: Mar 20 10:55:43 crc kubenswrapper[4860]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:49.90183254 +0000 UTC m=+14.123193448,LastTimestamp:2026-03-20 10:54:49.90183254 +0000 UTC m=+14.123193448,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 10:55:43 crc kubenswrapper[4860]: > Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.683288 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e875218b21afb openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting 
headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:49.901947643 +0000 UTC m=+14.123308561,LastTimestamp:2026-03-20 10:54:49.901947643 +0000 UTC m=+14.123308561,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.688832 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 20 10:55:43 crc kubenswrapper[4860]: &Event{ObjectMeta:{kube-apiserver-crc.189e875297babd86 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:ProbeError,Message:Liveness probe error: Get "https://192.168.126.11:17697/healthz": read tcp 192.168.126.11:52966->192.168.126.11:17697: read: connection reset by peer Mar 20 10:55:43 crc kubenswrapper[4860]: body: Mar 20 10:55:43 crc kubenswrapper[4860]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:52.033219974 +0000 UTC m=+16.254580912,LastTimestamp:2026-03-20 10:54:52.033219974 +0000 UTC m=+16.254580912,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 10:55:43 crc kubenswrapper[4860]: > Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.694132 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.189e875297bcc89f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Unhealthy,Message:Liveness probe failed: Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:52966->192.168.126.11:17697: read: connection reset by peer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:52.033353887 +0000 UTC m=+16.254714825,LastTimestamp:2026-03-20 10:54:52.033353887 +0000 UTC m=+16.254714825,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.699023 4860 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e8750130872d6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8750130872d6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:41.217008342 +0000 UTC m=+5.438369240,LastTimestamp:2026-03-20 10:54:52.52982466 +0000 UTC m=+16.751185558,Count:2,Type:Normal,EventTime:0001-01-01 
00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.702899 4860 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e87501e35ce1f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e87501e35ce1f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:41.404530207 +0000 UTC m=+5.625891105,LastTimestamp:2026-03-20 10:54:52.716404542 +0000 UTC m=+16.937765460,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.705888 4860 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e87501ee0c999\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e87501ee0c999 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container 
kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:41.415735705 +0000 UTC m=+5.637096603,LastTimestamp:2026-03-20 10:54:52.729470066 +0000 UTC m=+16.950830964,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.710008 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 20 10:55:43 crc kubenswrapper[4860]: &Event{ObjectMeta:{kube-apiserver-crc.189e8752deb48785 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 20 10:55:43 crc kubenswrapper[4860]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 20 10:55:43 crc kubenswrapper[4860]: Mar 20 10:55:43 crc kubenswrapper[4860]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:53.223995269 +0000 UTC m=+17.445356187,LastTimestamp:2026-03-20 10:54:53.223995269 +0000 UTC m=+17.445356187,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 10:55:43 crc kubenswrapper[4860]: > Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.717177 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot 
create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8752deb5a5c7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:53.224068551 +0000 UTC m=+17.445429459,LastTimestamp:2026-03-20 10:54:53.224068551 +0000 UTC m=+17.445429459,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.722773 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 10:55:43 crc kubenswrapper[4860]: &Event{ObjectMeta:{kube-controller-manager-crc.189e87546cc45667 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 20 10:55:43 crc kubenswrapper[4860]: body: Mar 20 10:55:43 crc kubenswrapper[4860]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:59.902363239 +0000 UTC 
m=+24.123724147,LastTimestamp:2026-03-20 10:54:59.902363239 +0000 UTC m=+24.123724147,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 10:55:43 crc kubenswrapper[4860]: > Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.726876 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e87546cc52df4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:59.90241842 +0000 UTC m=+24.123779318,LastTimestamp:2026-03-20 10:54:59.90241842 +0000 UTC m=+24.123779318,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.733185 4860 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e87546cc45667\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 10:55:43 crc kubenswrapper[4860]: &Event{ObjectMeta:{kube-controller-manager-crc.189e87546cc45667 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC 
map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 20 10:55:43 crc kubenswrapper[4860]: body: Mar 20 10:55:43 crc kubenswrapper[4860]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:59.902363239 +0000 UTC m=+24.123724147,LastTimestamp:2026-03-20 10:55:09.902721023 +0000 UTC m=+34.124081921,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 10:55:43 crc kubenswrapper[4860]: > Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.737703 4860 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e87546cc52df4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e87546cc52df4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:59.90241842 +0000 UTC m=+24.123779318,LastTimestamp:2026-03-20 
10:55:09.902814535 +0000 UTC m=+34.124175433,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.744055 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8756c1047be9 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:09.905787881 +0000 UTC m=+34.127148779,LastTimestamp:2026-03-20 10:55:09.905787881 +0000 UTC m=+34.127148779,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.747826 4860 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e874fd73d7a7d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e874fd73d7a7d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.213850749 +0000 UTC m=+4.435211657,LastTimestamp:2026-03-20 10:55:10.022453539 +0000 UTC m=+34.243814447,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.752122 4860 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e874fec61e627\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e874fec61e627 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.568559143 +0000 UTC m=+4.789920041,LastTimestamp:2026-03-20 10:55:10.167997924 +0000 UTC m=+34.389358822,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.755597 4860 event.go:359] "Server rejected event 
(will not retry!)" err="events \"kube-controller-manager-crc.189e874fed485a3b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e874fed485a3b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.583662139 +0000 UTC m=+4.805023037,LastTimestamp:2026-03-20 10:55:10.176287726 +0000 UTC m=+34.397648624,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.761592 4860 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e87546cc45667\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 10:55:43 crc kubenswrapper[4860]: &Event{ObjectMeta:{kube-controller-manager-crc.189e87546cc45667 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 
20 10:55:43 crc kubenswrapper[4860]: body: Mar 20 10:55:43 crc kubenswrapper[4860]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:59.902363239 +0000 UTC m=+24.123724147,LastTimestamp:2026-03-20 10:55:19.902094399 +0000 UTC m=+44.123455317,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 10:55:43 crc kubenswrapper[4860]: > Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.766440 4860 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e87546cc52df4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e87546cc52df4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:59.90241842 +0000 UTC m=+24.123779318,LastTimestamp:2026-03-20 10:55:19.902722495 +0000 UTC m=+44.124083413,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.771474 4860 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e87546cc45667\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" 
in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 10:55:43 crc kubenswrapper[4860]: &Event{ObjectMeta:{kube-controller-manager-crc.189e87546cc45667 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 20 10:55:43 crc kubenswrapper[4860]: body: Mar 20 10:55:43 crc kubenswrapper[4860]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:59.902363239 +0000 UTC m=+24.123724147,LastTimestamp:2026-03-20 10:55:29.902317716 +0000 UTC m=+54.123678624,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 10:55:43 crc kubenswrapper[4860]: > Mar 20 10:55:44 crc kubenswrapper[4860]: I0320 10:55:44.266254 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:55:45 crc kubenswrapper[4860]: I0320 10:55:45.263050 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:55:46 crc kubenswrapper[4860]: I0320 10:55:46.263492 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get 
resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:55:46 crc kubenswrapper[4860]: I0320 10:55:46.901729 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:55:46 crc kubenswrapper[4860]: I0320 10:55:46.901916 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:46 crc kubenswrapper[4860]: I0320 10:55:46.902956 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:46 crc kubenswrapper[4860]: I0320 10:55:46.903001 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:46 crc kubenswrapper[4860]: I0320 10:55:46.903009 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:47 crc kubenswrapper[4860]: I0320 10:55:47.264938 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:55:47 crc kubenswrapper[4860]: I0320 10:55:47.368832 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:55:47 crc kubenswrapper[4860]: E0320 10:55:47.571587 4860 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 10:55:47 crc kubenswrapper[4860]: I0320 10:55:47.726265 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:47 crc kubenswrapper[4860]: I0320 10:55:47.727398 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 
20 10:55:47 crc kubenswrapper[4860]: I0320 10:55:47.727456 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:47 crc kubenswrapper[4860]: I0320 10:55:47.727474 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:48 crc kubenswrapper[4860]: I0320 10:55:48.264707 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:55:48 crc kubenswrapper[4860]: I0320 10:55:48.658551 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:48 crc kubenswrapper[4860]: I0320 10:55:48.661562 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:48 crc kubenswrapper[4860]: I0320 10:55:48.661622 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:48 crc kubenswrapper[4860]: I0320 10:55:48.661671 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:48 crc kubenswrapper[4860]: I0320 10:55:48.661719 4860 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 10:55:48 crc kubenswrapper[4860]: E0320 10:55:48.663823 4860 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 10:55:48 crc kubenswrapper[4860]: E0320 10:55:48.664152 4860 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create 
resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 10:55:48 crc kubenswrapper[4860]: I0320 10:55:48.732844 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:55:48 crc kubenswrapper[4860]: I0320 10:55:48.732996 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:48 crc kubenswrapper[4860]: I0320 10:55:48.734280 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:48 crc kubenswrapper[4860]: I0320 10:55:48.734306 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:48 crc kubenswrapper[4860]: I0320 10:55:48.734317 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:49 crc kubenswrapper[4860]: I0320 10:55:49.270265 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:55:49 crc kubenswrapper[4860]: I0320 10:55:49.413267 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:49 crc kubenswrapper[4860]: I0320 10:55:49.414758 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:49 crc kubenswrapper[4860]: I0320 10:55:49.414832 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:49 crc kubenswrapper[4860]: I0320 10:55:49.414850 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:49 crc kubenswrapper[4860]: I0320 
10:55:49.415720 4860 scope.go:117] "RemoveContainer" containerID="cf65a1d8049a22874f20125f441990ba9f9e338d0f8b206706a27079a88b069e" Mar 20 10:55:49 crc kubenswrapper[4860]: I0320 10:55:49.731972 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 10:55:49 crc kubenswrapper[4860]: I0320 10:55:49.733734 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa"} Mar 20 10:55:49 crc kubenswrapper[4860]: I0320 10:55:49.733936 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:49 crc kubenswrapper[4860]: I0320 10:55:49.735072 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:49 crc kubenswrapper[4860]: I0320 10:55:49.735129 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:49 crc kubenswrapper[4860]: I0320 10:55:49.735150 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:49 crc kubenswrapper[4860]: W0320 10:55:49.949023 4860 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 20 10:55:49 crc kubenswrapper[4860]: E0320 10:55:49.949086 4860 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource 
\"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 20 10:55:50 crc kubenswrapper[4860]: I0320 10:55:50.263725 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:55:50 crc kubenswrapper[4860]: I0320 10:55:50.439697 4860 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 10:55:50 crc kubenswrapper[4860]: I0320 10:55:50.466848 4860 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 20 10:55:50 crc kubenswrapper[4860]: I0320 10:55:50.738560 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 10:55:50 crc kubenswrapper[4860]: I0320 10:55:50.739113 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 10:55:50 crc kubenswrapper[4860]: I0320 10:55:50.740735 4860 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa" exitCode=255 Mar 20 10:55:50 crc kubenswrapper[4860]: I0320 10:55:50.740779 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa"} Mar 20 10:55:50 crc kubenswrapper[4860]: I0320 10:55:50.740819 4860 scope.go:117] "RemoveContainer" containerID="cf65a1d8049a22874f20125f441990ba9f9e338d0f8b206706a27079a88b069e" Mar 20 10:55:50 
crc kubenswrapper[4860]: I0320 10:55:50.740972 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:50 crc kubenswrapper[4860]: I0320 10:55:50.747643 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:50 crc kubenswrapper[4860]: I0320 10:55:50.747717 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:50 crc kubenswrapper[4860]: I0320 10:55:50.747736 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:50 crc kubenswrapper[4860]: I0320 10:55:50.748504 4860 scope.go:117] "RemoveContainer" containerID="b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa" Mar 20 10:55:50 crc kubenswrapper[4860]: E0320 10:55:50.748728 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 10:55:51 crc kubenswrapper[4860]: I0320 10:55:51.077277 4860 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:55:51 crc kubenswrapper[4860]: I0320 10:55:51.273364 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:55:51 crc kubenswrapper[4860]: I0320 10:55:51.746047 4860 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 10:55:51 crc kubenswrapper[4860]: I0320 10:55:51.749002 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:51 crc kubenswrapper[4860]: I0320 10:55:51.749866 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:51 crc kubenswrapper[4860]: I0320 10:55:51.749899 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:51 crc kubenswrapper[4860]: I0320 10:55:51.749908 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:51 crc kubenswrapper[4860]: I0320 10:55:51.750554 4860 scope.go:117] "RemoveContainer" containerID="b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa" Mar 20 10:55:51 crc kubenswrapper[4860]: E0320 10:55:51.750852 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 10:55:52 crc kubenswrapper[4860]: I0320 10:55:52.260894 4860 csr.go:261] certificate signing request csr-66mx8 is approved, waiting to be issued Mar 20 10:55:52 crc kubenswrapper[4860]: I0320 10:55:52.264097 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:55:52 crc kubenswrapper[4860]: I0320 10:55:52.268530 4860 csr.go:257] 
certificate signing request csr-66mx8 is issued Mar 20 10:55:52 crc kubenswrapper[4860]: I0320 10:55:52.307531 4860 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 20 10:55:52 crc kubenswrapper[4860]: I0320 10:55:52.639437 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:55:52 crc kubenswrapper[4860]: I0320 10:55:52.728733 4860 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 20 10:55:52 crc kubenswrapper[4860]: I0320 10:55:52.751836 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:52 crc kubenswrapper[4860]: I0320 10:55:52.753154 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:52 crc kubenswrapper[4860]: I0320 10:55:52.753219 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:52 crc kubenswrapper[4860]: I0320 10:55:52.753269 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:52 crc kubenswrapper[4860]: I0320 10:55:52.754213 4860 scope.go:117] "RemoveContainer" containerID="b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa" Mar 20 10:55:52 crc kubenswrapper[4860]: E0320 10:55:52.754446 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 10:55:52 crc kubenswrapper[4860]: I0320 10:55:52.963080 4860 transport.go:147] "Certificate rotation detected, shutting down client 
connections to start using new credentials" Mar 20 10:55:52 crc kubenswrapper[4860]: W0320 10:55:52.963342 4860 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.Service ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received Mar 20 10:55:53 crc kubenswrapper[4860]: I0320 10:55:53.270972 4860 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-11 20:20:01.884747405 +0000 UTC Mar 20 10:55:53 crc kubenswrapper[4860]: I0320 10:55:53.271026 4860 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6393h24m8.613725185s for next certificate rotation Mar 20 10:55:55 crc kubenswrapper[4860]: I0320 10:55:55.664692 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:55 crc kubenswrapper[4860]: I0320 10:55:55.666508 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:55 crc kubenswrapper[4860]: I0320 10:55:55.666584 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:55 crc kubenswrapper[4860]: I0320 10:55:55.666603 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:55 crc kubenswrapper[4860]: I0320 10:55:55.666709 4860 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 10:55:55 crc kubenswrapper[4860]: I0320 10:55:55.679597 4860 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 20 10:55:55 crc kubenswrapper[4860]: I0320 10:55:55.680132 4860 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 20 10:55:55 crc kubenswrapper[4860]: E0320 10:55:55.680169 4860 kubelet_node_status.go:585] 
"Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 20 10:55:55 crc kubenswrapper[4860]: I0320 10:55:55.685210 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:55 crc kubenswrapper[4860]: I0320 10:55:55.685300 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:55 crc kubenswrapper[4860]: I0320 10:55:55.685321 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:55 crc kubenswrapper[4860]: I0320 10:55:55.685350 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:55 crc kubenswrapper[4860]: I0320 10:55:55.685374 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:55Z","lastTransitionTime":"2026-03-20T10:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:55 crc kubenswrapper[4860]: E0320 10:55:55.706546 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:55:55 crc kubenswrapper[4860]: I0320 10:55:55.716657 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:55 crc kubenswrapper[4860]: I0320 10:55:55.716703 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:55 crc kubenswrapper[4860]: I0320 10:55:55.716717 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:55 crc kubenswrapper[4860]: I0320 10:55:55.716737 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:55 crc kubenswrapper[4860]: I0320 10:55:55.716752 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:55Z","lastTransitionTime":"2026-03-20T10:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:55 crc kubenswrapper[4860]: E0320 10:55:55.731838 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:55:55 crc kubenswrapper[4860]: I0320 10:55:55.743406 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:55 crc kubenswrapper[4860]: I0320 10:55:55.743459 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:55 crc kubenswrapper[4860]: I0320 10:55:55.743474 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:55 crc kubenswrapper[4860]: I0320 10:55:55.743499 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:55 crc kubenswrapper[4860]: I0320 10:55:55.743516 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:55Z","lastTransitionTime":"2026-03-20T10:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:55 crc kubenswrapper[4860]: E0320 10:55:55.762409 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:55:55 crc kubenswrapper[4860]: I0320 10:55:55.772948 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:55 crc kubenswrapper[4860]: I0320 10:55:55.773003 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:55 crc kubenswrapper[4860]: I0320 10:55:55.773019 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:55 crc kubenswrapper[4860]: I0320 10:55:55.773040 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:55 crc kubenswrapper[4860]: I0320 10:55:55.773058 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:55Z","lastTransitionTime":"2026-03-20T10:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:55 crc kubenswrapper[4860]: E0320 10:55:55.792484 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:55:55 crc kubenswrapper[4860]: E0320 10:55:55.792749 4860 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 10:55:55 crc kubenswrapper[4860]: E0320 10:55:55.792800 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:55:55 crc kubenswrapper[4860]: E0320 10:55:55.893880 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:55:55 crc kubenswrapper[4860]: E0320 10:55:55.995007 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:55:56 crc kubenswrapper[4860]: E0320 10:55:56.095410 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:55:56 crc kubenswrapper[4860]: E0320 10:55:56.196397 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:55:56 crc kubenswrapper[4860]: E0320 10:55:56.296906 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:55:56 crc kubenswrapper[4860]: E0320 10:55:56.398055 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:55:56 crc kubenswrapper[4860]: E0320 10:55:56.498587 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:55:56 crc kubenswrapper[4860]: E0320 10:55:56.598891 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:55:56 crc kubenswrapper[4860]: E0320 10:55:56.699932 4860 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:55:56 crc kubenswrapper[4860]: E0320 10:55:56.800706 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:55:56 crc kubenswrapper[4860]: E0320 10:55:56.901044 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:55:57 crc kubenswrapper[4860]: E0320 10:55:57.001575 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:55:57 crc kubenswrapper[4860]: E0320 10:55:57.102326 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:55:57 crc kubenswrapper[4860]: E0320 10:55:57.203323 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:55:57 crc kubenswrapper[4860]: E0320 10:55:57.303968 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:55:57 crc kubenswrapper[4860]: I0320 10:55:57.374913 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:55:57 crc kubenswrapper[4860]: I0320 10:55:57.375167 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:57 crc kubenswrapper[4860]: I0320 10:55:57.377338 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:57 crc kubenswrapper[4860]: I0320 10:55:57.377415 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:57 crc kubenswrapper[4860]: I0320 10:55:57.377441 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 20 10:55:57 crc kubenswrapper[4860]: E0320 10:55:57.404820 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:55:57 crc kubenswrapper[4860]: E0320 10:55:57.505610 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:55:57 crc kubenswrapper[4860]: E0320 10:55:57.572288 4860 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 10:55:57 crc kubenswrapper[4860]: E0320 10:55:57.606613 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:55:57 crc kubenswrapper[4860]: E0320 10:55:57.707734 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:55:57 crc kubenswrapper[4860]: E0320 10:55:57.808846 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:55:57 crc kubenswrapper[4860]: E0320 10:55:57.909949 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:55:58 crc kubenswrapper[4860]: E0320 10:55:58.010841 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:55:58 crc kubenswrapper[4860]: E0320 10:55:58.111062 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:55:58 crc kubenswrapper[4860]: E0320 10:55:58.211271 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:55:58 crc kubenswrapper[4860]: E0320 10:55:58.311888 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:55:58 crc kubenswrapper[4860]: E0320 10:55:58.413147 4860 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 10:55:58 crc kubenswrapper[4860]: E0320 10:55:58.514108 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 10:55:58 crc kubenswrapper[4860]: E0320 10:55:58.614679 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 10:55:58 crc kubenswrapper[4860]: E0320 10:55:58.715473 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 10:55:58 crc kubenswrapper[4860]: E0320 10:55:58.815793 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 10:55:58 crc kubenswrapper[4860]: E0320 10:55:58.916886 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 10:55:59 crc kubenswrapper[4860]: E0320 10:55:59.018029 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 10:55:59 crc kubenswrapper[4860]: E0320 10:55:59.119087 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 10:55:59 crc kubenswrapper[4860]: E0320 10:55:59.220084 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 10:55:59 crc kubenswrapper[4860]: E0320 10:55:59.320907 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 10:55:59 crc kubenswrapper[4860]: E0320 10:55:59.421089 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 10:55:59 crc kubenswrapper[4860]: E0320 10:55:59.521832 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 10:55:59 crc kubenswrapper[4860]: E0320 10:55:59.622929 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 10:55:59 crc kubenswrapper[4860]: E0320 10:55:59.724110 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 10:55:59 crc kubenswrapper[4860]: E0320 10:55:59.825191 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 10:55:59 crc kubenswrapper[4860]: E0320 10:55:59.926410 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 10:56:00 crc kubenswrapper[4860]: E0320 10:56:00.027023 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 10:56:00 crc kubenswrapper[4860]: E0320 10:56:00.127304 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 10:56:00 crc kubenswrapper[4860]: E0320 10:56:00.228302 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 10:56:00 crc kubenswrapper[4860]: E0320 10:56:00.329285 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 10:56:00 crc kubenswrapper[4860]: E0320 10:56:00.429975 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 10:56:00 crc kubenswrapper[4860]: E0320 10:56:00.530662 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 10:56:00 crc kubenswrapper[4860]: E0320 10:56:00.631720 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 10:56:00 crc kubenswrapper[4860]: E0320 10:56:00.732433 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 10:56:00 crc kubenswrapper[4860]: E0320 10:56:00.833052 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 10:56:00 crc kubenswrapper[4860]: E0320 10:56:00.934301 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 10:56:01 crc kubenswrapper[4860]: E0320 10:56:01.035217 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 10:56:01 crc kubenswrapper[4860]: E0320 10:56:01.136317 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 10:56:01 crc kubenswrapper[4860]: E0320 10:56:01.237389 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 10:56:01 crc kubenswrapper[4860]: E0320 10:56:01.338064 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 10:56:01 crc kubenswrapper[4860]: E0320 10:56:01.438514 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 10:56:01 crc kubenswrapper[4860]: E0320 10:56:01.539459 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 10:56:01 crc kubenswrapper[4860]: E0320 10:56:01.639968 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 10:56:01 crc kubenswrapper[4860]: E0320 10:56:01.741041 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 10:56:01 crc kubenswrapper[4860]: E0320 10:56:01.841540 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 10:56:01 crc kubenswrapper[4860]: E0320 10:56:01.941642 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 10:56:02 crc kubenswrapper[4860]: E0320 10:56:02.042146 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 10:56:02 crc kubenswrapper[4860]: E0320 10:56:02.142426 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 10:56:02 crc kubenswrapper[4860]: E0320 10:56:02.243719 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 10:56:02 crc kubenswrapper[4860]: E0320 10:56:02.344697 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 10:56:02 crc kubenswrapper[4860]: I0320 10:56:02.413066 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 10:56:02 crc kubenswrapper[4860]: I0320 10:56:02.415561 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:56:02 crc kubenswrapper[4860]: I0320 10:56:02.415610 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:56:02 crc kubenswrapper[4860]: I0320 10:56:02.415620 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:56:02 crc kubenswrapper[4860]: E0320 10:56:02.445322 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 10:56:02 crc kubenswrapper[4860]: E0320 10:56:02.545952 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 10:56:02 crc kubenswrapper[4860]: E0320 10:56:02.646847 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 10:56:02 crc kubenswrapper[4860]: I0320 10:56:02.689423 4860 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Mar 20 10:56:02 crc kubenswrapper[4860]: E0320 10:56:02.747906 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 10:56:02 crc kubenswrapper[4860]: E0320 10:56:02.848355 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 10:56:02 crc kubenswrapper[4860]: E0320 10:56:02.949106 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 10:56:03 crc kubenswrapper[4860]: E0320 10:56:03.050339 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 10:56:03 crc kubenswrapper[4860]: E0320 10:56:03.150602 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 10:56:03 crc kubenswrapper[4860]: E0320 10:56:03.251735 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 10:56:03 crc kubenswrapper[4860]: I0320 10:56:03.343013 4860 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 20 10:56:03 crc kubenswrapper[4860]: I0320 10:56:03.354883 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:56:03 crc kubenswrapper[4860]: I0320 10:56:03.354961 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:56:03 crc kubenswrapper[4860]: I0320 10:56:03.354988 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:56:03 crc kubenswrapper[4860]: I0320 10:56:03.355020 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 10:56:03 crc kubenswrapper[4860]: I0320 10:56:03.355044 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:03Z","lastTransitionTime":"2026-03-20T10:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 10:56:03 crc kubenswrapper[4860]: I0320 10:56:03.459267 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:56:03 crc kubenswrapper[4860]: I0320 10:56:03.459340 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:56:03 crc kubenswrapper[4860]: I0320 10:56:03.459358 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:56:03 crc kubenswrapper[4860]: I0320 10:56:03.459880 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 10:56:03 crc kubenswrapper[4860]: I0320 10:56:03.459941 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:03Z","lastTransitionTime":"2026-03-20T10:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 10:56:03 crc kubenswrapper[4860]: I0320 10:56:03.563124 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:56:03 crc kubenswrapper[4860]: I0320 10:56:03.563183 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:56:03 crc kubenswrapper[4860]: I0320 10:56:03.563196 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:56:03 crc kubenswrapper[4860]: I0320 10:56:03.563217 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 10:56:03 crc kubenswrapper[4860]: I0320 10:56:03.563249 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:03Z","lastTransitionTime":"2026-03-20T10:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 10:56:03 crc kubenswrapper[4860]: I0320 10:56:03.666319 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:56:03 crc kubenswrapper[4860]: I0320 10:56:03.666379 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:56:03 crc kubenswrapper[4860]: I0320 10:56:03.666402 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:56:03 crc kubenswrapper[4860]: I0320 10:56:03.666432 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 10:56:03 crc kubenswrapper[4860]: I0320 10:56:03.666454 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:03Z","lastTransitionTime":"2026-03-20T10:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 10:56:03 crc kubenswrapper[4860]: I0320 10:56:03.769886 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:56:03 crc kubenswrapper[4860]: I0320 10:56:03.769949 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:56:03 crc kubenswrapper[4860]: I0320 10:56:03.769967 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:56:03 crc kubenswrapper[4860]: I0320 10:56:03.769992 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 10:56:03 crc kubenswrapper[4860]: I0320 10:56:03.770011 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:03Z","lastTransitionTime":"2026-03-20T10:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 10:56:03 crc kubenswrapper[4860]: I0320 10:56:03.873202 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:56:03 crc kubenswrapper[4860]: I0320 10:56:03.873296 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:56:03 crc kubenswrapper[4860]: I0320 10:56:03.873319 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:56:03 crc kubenswrapper[4860]: I0320 10:56:03.873348 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 10:56:03 crc kubenswrapper[4860]: I0320 10:56:03.873372 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:03Z","lastTransitionTime":"2026-03-20T10:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 10:56:03 crc kubenswrapper[4860]: I0320 10:56:03.977543 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:56:03 crc kubenswrapper[4860]: I0320 10:56:03.977604 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:56:03 crc kubenswrapper[4860]: I0320 10:56:03.977614 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:56:03 crc kubenswrapper[4860]: I0320 10:56:03.977640 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 10:56:03 crc kubenswrapper[4860]: I0320 10:56:03.977652 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:03Z","lastTransitionTime":"2026-03-20T10:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.081661 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.081746 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.081771 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.081795 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.081813 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:04Z","lastTransitionTime":"2026-03-20T10:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.185397 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.185473 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.185489 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.185516 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.185548 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:04Z","lastTransitionTime":"2026-03-20T10:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.288908 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.288965 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.288976 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.288998 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.289011 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:04Z","lastTransitionTime":"2026-03-20T10:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.299084 4860 apiserver.go:52] "Watching apiserver"
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.305261 4860 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.305788 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb"]
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.306344 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.306396 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 20 10:56:04 crc kubenswrapper[4860]: E0320 10:56:04.306471 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.306690 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 10:56:04 crc kubenswrapper[4860]: E0320 10:56:04.306778 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.307083 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.307561 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.307579 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 10:56:04 crc kubenswrapper[4860]: E0320 10:56:04.307688 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.312128 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.312891 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.320115 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.321661 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.321992 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.322221 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.322477 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.322570 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.322751 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.346525 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.362254 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.367160 4860 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.376054 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.388365 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.389693 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.389774 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.389802 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.389834 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.389862 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.389894 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.389924 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.389951 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.389979 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.390006 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.390032 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.390115 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.390140 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.390164 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod
\"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.390189 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.390277 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.390303 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.390332 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.390362 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.390389 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.391131 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.390243 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.391557 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.391716 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). 
InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.391752 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.391725 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.391954 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.392596 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.392615 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.392606 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.392841 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.392987 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.393122 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.393166 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.393303 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.393276 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.391504 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.393284 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.393754 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.394120 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.394105 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.394724 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.394505 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.394921 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.395341 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.395353 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.395391 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.395471 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.395527 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.395683 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.395598 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.395894 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.395896 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.396203 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.395938 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.396378 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.397049 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.397083 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.396826 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.397205 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.396550 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.397294 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.397302 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.397737 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.398083 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.398174 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.398199 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: 
\"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.398261 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.398303 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.398344 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.398382 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.398412 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.398443 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.398477 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.398510 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.398549 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.398580 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.398608 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 
10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.400663 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.400891 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401056 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401092 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401116 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 
10:56:04.401137 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401157 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401176 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401194 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401213 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401252 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod 
\"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401270 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401291 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401312 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401335 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401353 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401370 4860 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401389 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401408 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401427 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401446 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401470 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401491 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401508 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401638 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401659 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401679 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") 
" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401704 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401731 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401758 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401781 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401800 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401819 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401837 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401855 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401873 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401891 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401913 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 20 10:56:04 crc 
kubenswrapper[4860]: I0320 10:56:04.401932 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401952 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401972 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401971 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401993 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.402014 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.402033 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.402060 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.402080 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.402099 4860 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.402118 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.402142 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.402161 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.402180 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.402201 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" 
(UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.402245 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.402268 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.402291 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.402317 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.402341 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.402359 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.402375 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.402408 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.402436 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.402457 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.402478 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: 
\"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.402503 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.402529 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.402550 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.402575 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.402590 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.402637 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.402886 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.402601 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403038 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403074 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 10:56:04 crc 
kubenswrapper[4860]: I0320 10:56:04.403100 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403126 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403149 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403169 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403253 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403282 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403311 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403342 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403363 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403382 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403400 4860 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403418 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403436 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403453 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403473 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403488 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 
20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403506 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403522 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403407 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403542 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403563 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403580 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403595 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403613 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403631 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403650 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403672 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403689 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403712 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403732 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: 
\"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403750 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403769 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403786 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403804 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403823 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403818 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403842 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403828 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403871 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403906 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403888 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403927 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.404005 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.404070 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.404055 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.404115 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.404157 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.404181 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.404210 4860 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.404241 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.404267 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.404302 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.404332 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.404360 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.404386 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.404412 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.404446 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.404475 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.404501 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 20 
10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.404532 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.404563 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.404595 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.404621 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.404650 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.404664 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:04 crc 
kubenswrapper[4860]: I0320 10:56:04.404705 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.404721 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.404743 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.404759 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:04Z","lastTransitionTime":"2026-03-20T10:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.404677 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.406419 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.406452 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: 
\"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.406481 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.406509 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.406537 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.406563 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.406589 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.406613 4860 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.406635 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.406655 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.406676 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.406712 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.406735 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " 
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.406755 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.406775 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.406813 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.406841 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.406867 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.406892 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: 
\"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.406914 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.406941 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.406971 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.406988 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407007 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 
10:56:04.407046 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407073 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407094 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407128 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407151 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod 
\"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407173 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407193 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407240 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407271 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407296 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: 
\"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407315 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407334 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407465 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407490 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407593 4860 reconciler_common.go:293] "Volume detached for 
volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407612 4860 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407626 4860 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407640 4860 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407655 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407669 4860 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407683 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407696 4860 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" 
DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407713 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407729 4860 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407746 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407762 4860 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407778 4860 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407793 4860 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407807 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 20 
10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407820 4860 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407834 4860 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407849 4860 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407861 4860 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407873 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407886 4860 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407897 4860 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407910 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: 
\"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407924 4860 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407938 4860 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407951 4860 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407965 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407977 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407991 4860 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.408006 4860 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.408018 4860 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.408034 4860 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.408048 4860 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.408060 4860 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.408074 4860 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.408088 4860 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.408102 4860 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 
crc kubenswrapper[4860]: I0320 10:56:04.408125 4860 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.408140 4860 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.408152 4860 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.408167 4860 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.408182 4860 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.408196 4860 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.408210 4860 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.408245 4860 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.408260 4860 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.404261 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.404337 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.404673 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.404725 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.404742 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.404861 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.405511 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.405943 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.409398 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.406155 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.406643 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.406669 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.406694 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.406789 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407004 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407199 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407214 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407584 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407615 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407690 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407744 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.408078 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.408094 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.408347 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.408410 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.408511 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.408725 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.408786 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.408825 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.409036 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.409130 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: E0320 10:56:04.409184 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:56:04.909149032 +0000 UTC m=+89.130509930 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.409728 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.410129 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.410196 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.410209 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.410472 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.410566 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.410663 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.410671 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.410737 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.410933 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.410945 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.410989 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.411099 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.411103 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.411154 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.411166 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.411342 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.411665 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.411679 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.411937 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.411945 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.412676 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.413188 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.413264 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.413293 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.413327 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.413341 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.413386 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.413398 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.413605 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.413633 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.413789 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.413790 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.414153 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.414169 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.414298 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.414328 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.414488 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.414630 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: E0320 10:56:04.414642 4860 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 10:56:04 crc kubenswrapper[4860]: E0320 10:56:04.414706 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:04.914691181 +0000 UTC m=+89.136052279 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.414893 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.415014 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.415025 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.415104 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.415392 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.415462 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: E0320 10:56:04.415661 4860 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:56:04 crc kubenswrapper[4860]: E0320 10:56:04.415736 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:04.915714979 +0000 UTC m=+89.137075867 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.415899 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.416030 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.416269 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.416309 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.416705 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.417165 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.417280 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.417568 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.417676 4860 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.417869 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.417961 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.418129 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.418429 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.418431 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.418595 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.418623 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.418739 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.419199 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.419376 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.419635 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.419803 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.420076 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.420703 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.420993 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.421359 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.421760 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.421943 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.422711 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.422743 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.422782 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.422836 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.423016 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.423317 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.424447 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: 
"machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.424885 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.424733 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.427434 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.429744 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.429876 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.433536 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 10:56:04 crc kubenswrapper[4860]: E0320 10:56:04.434926 4860 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 20 10:56:04 crc kubenswrapper[4860]: E0320 10:56:04.434956 4860 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 20 10:56:04 crc kubenswrapper[4860]: E0320 10:56:04.434972 4860 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 10:56:04 crc kubenswrapper[4860]: E0320 10:56:04.435067 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:04.935038757 +0000 UTC m=+89.156399655 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.435162 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.435970 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.436138 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.436784 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.436966 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.437206 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.437697 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.437905 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.438368 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.442180 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 20 10:56:04 crc kubenswrapper[4860]: E0320 10:56:04.443901 4860 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 20 10:56:04 crc kubenswrapper[4860]: E0320 10:56:04.443931 4860 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 20 10:56:04 crc kubenswrapper[4860]: E0320 10:56:04.443945 4860 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 10:56:04 crc kubenswrapper[4860]: E0320 10:56:04.444021 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:04.943999597 +0000 UTC m=+89.165360495 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.446303 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.446341 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.447716 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.448151 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.448385 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.448473 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.448815 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.448915 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.449379 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.449601 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.449682 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.449710 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.450006 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.450123 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.450370 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.450412 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.450676 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.450760 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.450982 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.451489 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.451617 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.452096 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.452135 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.452127 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.452414 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.452583 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.452917 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.453076 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.453255 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.453650 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.454075 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.460925 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.463880 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.473551 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.474382 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.507841 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.507879 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.507887 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.507902 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.507911 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:04Z","lastTransitionTime":"2026-03-20T10:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.508642 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.508711 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.508779 4860 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.508809 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\""
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.508825 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\""
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.508838 4860 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\""
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.508850 4860 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\""
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.508864 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\""
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.508852 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.508892 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.508876 4860 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509004 4860 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\""
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509021 4860 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\""
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509037 4860 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509051 4860 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\""
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509064 4860 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509079 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\""
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509094 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\""
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509107 4860 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\""
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320
10:56:04.509119 4860 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509132 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509151 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509173 4860 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509186 4860 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509199 4860 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509212 4860 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509250 4860 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509265 4860 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509278 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509291 4860 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509305 4860 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509319 4860 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509333 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509348 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: 
\"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509361 4860 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509374 4860 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509387 4860 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509403 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509417 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509432 4860 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509444 4860 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509458 4860 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509472 4860 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509485 4860 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509497 4860 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509512 4860 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509528 4860 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509546 4860 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" 
DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509561 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509574 4860 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509610 4860 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509624 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509636 4860 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509648 4860 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509661 4860 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc 
kubenswrapper[4860]: I0320 10:56:04.509675 4860 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509688 4860 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509700 4860 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509712 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509724 4860 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509735 4860 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509746 4860 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509759 4860 reconciler_common.go:293] "Volume detached for volume 
\"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509772 4860 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509783 4860 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509797 4860 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509809 4860 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509821 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509833 4860 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509846 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" 
DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509858 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509871 4860 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509883 4860 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509896 4860 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509908 4860 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509920 4860 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509933 4860 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509945 4860 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509961 4860 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509974 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509987 4860 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509998 4860 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510009 4860 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510021 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510033 4860 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510045 4860 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510057 4860 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510077 4860 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510088 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510100 4860 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510111 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510123 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510134 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510147 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510159 4860 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510171 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510182 4860 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510193 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510204 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: 
\"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510215 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510251 4860 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510263 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510274 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510287 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510299 4860 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510310 4860 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510321 4860 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510334 4860 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510345 4860 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510357 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510368 4860 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510379 4860 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510390 4860 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 
20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510401 4860 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510414 4860 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510427 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510438 4860 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510448 4860 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510459 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510476 4860 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510486 4860 
reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510497 4860 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510509 4860 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510520 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510530 4860 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510543 4860 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510564 4860 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510574 4860 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: 
\"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510593 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510604 4860 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510617 4860 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510628 4860 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510639 4860 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510651 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510664 4860 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 
10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510677 4860 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510691 4860 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510702 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510712 4860 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510724 4860 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510736 4860 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510748 4860 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510760 4860 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510772 4860 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510785 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510798 4860 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510811 4860 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510823 4860 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510838 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510850 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" 
DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510865 4860 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510880 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.610805 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.610856 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.610867 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.610889 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.610899 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:04Z","lastTransitionTime":"2026-03-20T10:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.626188 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.643132 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.657311 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 10:56:04 crc kubenswrapper[4860]: W0320 10:56:04.687844 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-571c75a4815de926687021cf89c33db429d5e84710fed66ad8d963e67f7cf8e1 WatchSource:0}: Error finding container 571c75a4815de926687021cf89c33db429d5e84710fed66ad8d963e67f7cf8e1: Status 404 returned error can't find the container with id 571c75a4815de926687021cf89c33db429d5e84710fed66ad8d963e67f7cf8e1 Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.716313 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.716367 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.716378 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.716397 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.716412 4860 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:04Z","lastTransitionTime":"2026-03-20T10:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.787755 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"571c75a4815de926687021cf89c33db429d5e84710fed66ad8d963e67f7cf8e1"} Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.789342 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"63997ce9f6f9fd2913962a005d896518c8716e997da0a38f4c75591bd1349459"} Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.790389 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"6667f283d81a023cc34d25c0b8cc71760accc6925c987ad4f43175d0ea1ed791"} Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.819115 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.819151 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.819160 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.819177 4860 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.819187 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:04Z","lastTransitionTime":"2026-03-20T10:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.913648 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:56:04 crc kubenswrapper[4860]: E0320 10:56:04.913809 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:56:05.91376365 +0000 UTC m=+90.135124548 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.922321 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.922401 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.922425 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.922453 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.922472 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:04Z","lastTransitionTime":"2026-03-20T10:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.014751 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.014821 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.014858 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.014897 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:05 crc kubenswrapper[4860]: E0320 10:56:05.014933 4860 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object 
"openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 10:56:05 crc kubenswrapper[4860]: E0320 10:56:05.015012 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:06.014994476 +0000 UTC m=+90.236355374 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 10:56:05 crc kubenswrapper[4860]: E0320 10:56:05.015023 4860 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 10:56:05 crc kubenswrapper[4860]: E0320 10:56:05.015046 4860 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 10:56:05 crc kubenswrapper[4860]: E0320 10:56:05.015058 4860 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:56:05 crc kubenswrapper[4860]: E0320 10:56:05.015062 4860 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 10:56:05 crc kubenswrapper[4860]: E0320 10:56:05.015087 4860 
projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 10:56:05 crc kubenswrapper[4860]: E0320 10:56:05.015091 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:06.015082268 +0000 UTC m=+90.236443166 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:56:05 crc kubenswrapper[4860]: E0320 10:56:05.015102 4860 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:56:05 crc kubenswrapper[4860]: E0320 10:56:05.015134 4860 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:56:05 crc kubenswrapper[4860]: E0320 10:56:05.015162 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:06.01514456 +0000 UTC m=+90.236505478 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:56:05 crc kubenswrapper[4860]: E0320 10:56:05.015186 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:06.015174221 +0000 UTC m=+90.236535129 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.024606 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.024651 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.024668 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.024686 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.024698 4860 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:05Z","lastTransitionTime":"2026-03-20T10:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.127865 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.127915 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.127926 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.127944 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.127955 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:05Z","lastTransitionTime":"2026-03-20T10:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.231396 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.231467 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.231494 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.231530 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.231562 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:05Z","lastTransitionTime":"2026-03-20T10:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.335094 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.335159 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.335178 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.335205 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.335257 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:05Z","lastTransitionTime":"2026-03-20T10:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.413641 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:05 crc kubenswrapper[4860]: E0320 10:56:05.413833 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.421871 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.422411 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.423175 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.423812 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.424456 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.424963 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.425618 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.426135 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.426783 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.427290 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.427763 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.428476 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.429039 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.429717 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.430987 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.432809 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes"
Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.434827 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes"
Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.436682 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes"
Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.438068 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes"
Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.439707 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.439788 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.439804 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.439806 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes"
Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.439855 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.440059 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:05Z","lastTransitionTime":"2026-03-20T10:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.440964 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes"
Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.441795 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes"
Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.442400 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes"
Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.443299 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes"
Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.443891 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes"
Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.444534 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes"
Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.445299 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes"
Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.445959 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes"
Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.446827 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes"
Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.448779 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes"
Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.449709 4860 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6"
Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.449892 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes"
Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.452032 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes"
Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.452812 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes"
Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.453768 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes"
Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.455484 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes"
Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.456530 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes"
Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.457365 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes"
Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.458469 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes"
Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.460472 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes"
Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.461214 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes"
Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.462990 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes"
Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.463750 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes"
Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.464855 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes"
Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.465448 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes"
Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.466587 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes"
Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.467447 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes"
Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.468950 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes"
Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.469760 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes"
Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.471002 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes"
Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.471788 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes"
Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.472536 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes"
Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.473851 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes"
Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.474598 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes"
Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.543750 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.543803 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.543816 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.543834 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.543848 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:05Z","lastTransitionTime":"2026-03-20T10:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.647796 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.647858 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.647876 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.647900 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.647911 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:05Z","lastTransitionTime":"2026-03-20T10:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.750731 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.750780 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.750795 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.750814 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.750829 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:05Z","lastTransitionTime":"2026-03-20T10:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.795510 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12"}
Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.797928 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4"}
Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.797983 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7"}
Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.812043 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:05Z is after 2025-08-24T17:21:41Z"
Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.833144 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:05Z is after 2025-08-24T17:21:41Z"
Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.850661 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:05Z is after 2025-08-24T17:21:41Z"
Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.854245 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.854291 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.854304 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.854326 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.854343 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:05Z","lastTransitionTime":"2026-03-20T10:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.866160 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:05Z is after 2025-08-24T17:21:41Z"
Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.880078 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:05Z is after 2025-08-24T17:21:41Z"
Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.897553 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:05Z is after 2025-08-24T17:21:41Z"
Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.914584 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:05Z is after 2025-08-24T17:21:41Z"
Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.923998 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 10:56:05 crc kubenswrapper[4860]: E0320 10:56:05.924146 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:56:07.924115396 +0000 UTC m=+92.145476314 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.930982 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:05Z is after 2025-08-24T17:21:41Z"
Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.954030 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:05Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.957702 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.957777 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.957791 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.957811 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.957825 4860 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:05Z","lastTransitionTime":"2026-03-20T10:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.971043 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:05Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.986075 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:05Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.006555 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:06Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.024835 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.024895 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.024923 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.024948 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:06 crc kubenswrapper[4860]: E0320 10:56:06.025031 4860 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 10:56:06 crc kubenswrapper[4860]: E0320 10:56:06.025071 4860 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 10:56:06 crc kubenswrapper[4860]: E0320 10:56:06.025073 4860 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 10:56:06 crc kubenswrapper[4860]: E0320 10:56:06.025031 4860 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:56:06 crc kubenswrapper[4860]: E0320 10:56:06.025113 4860 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 10:56:06 crc kubenswrapper[4860]: E0320 10:56:06.025129 4860 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:08.025103245 +0000 UTC m=+92.246464153 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 10:56:06 crc kubenswrapper[4860]: E0320 10:56:06.025088 4860 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 10:56:06 crc kubenswrapper[4860]: E0320 10:56:06.025148 4860 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:56:06 crc kubenswrapper[4860]: E0320 10:56:06.025135 4860 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:56:06 crc kubenswrapper[4860]: E0320 10:56:06.025153 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-20 10:56:08.025142226 +0000 UTC m=+92.246503124 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:56:06 crc kubenswrapper[4860]: E0320 10:56:06.025255 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:08.025219539 +0000 UTC m=+92.246580447 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:56:06 crc kubenswrapper[4860]: E0320 10:56:06.025292 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:08.02527632 +0000 UTC m=+92.246637228 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.050132 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.050175 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.050188 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.050206 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.050235 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:06Z","lastTransitionTime":"2026-03-20T10:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:06 crc kubenswrapper[4860]: E0320 10:56:06.064958 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:06Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.068358 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.068385 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.068394 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.068408 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.068419 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:06Z","lastTransitionTime":"2026-03-20T10:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:06 crc kubenswrapper[4860]: E0320 10:56:06.084051 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:06Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.087691 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.087728 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.087739 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.087755 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.087766 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:06Z","lastTransitionTime":"2026-03-20T10:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:06 crc kubenswrapper[4860]: E0320 10:56:06.102180 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:06Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.106511 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.106553 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.106564 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.106584 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.106597 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:06Z","lastTransitionTime":"2026-03-20T10:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:06 crc kubenswrapper[4860]: E0320 10:56:06.133856 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:06Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.141696 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.141743 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.141757 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.141778 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.141792 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:06Z","lastTransitionTime":"2026-03-20T10:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:06 crc kubenswrapper[4860]: E0320 10:56:06.194965 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:06Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:06 crc kubenswrapper[4860]: E0320 10:56:06.195090 4860 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.197274 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.197318 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.197332 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.197352 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.197366 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:06Z","lastTransitionTime":"2026-03-20T10:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.300395 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.300450 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.300461 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.300480 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.300492 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:06Z","lastTransitionTime":"2026-03-20T10:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.402893 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.403029 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.403041 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.403069 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.403085 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:06Z","lastTransitionTime":"2026-03-20T10:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.413329 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.413456 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:06 crc kubenswrapper[4860]: E0320 10:56:06.413575 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:06 crc kubenswrapper[4860]: E0320 10:56:06.413642 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.505382 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.505462 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.505473 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.505493 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.505505 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:06Z","lastTransitionTime":"2026-03-20T10:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.608325 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.608385 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.608403 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.608425 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.608444 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:06Z","lastTransitionTime":"2026-03-20T10:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.710985 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.711024 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.711033 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.711050 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.711060 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:06Z","lastTransitionTime":"2026-03-20T10:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.813799 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.813895 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.813913 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.813940 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.813957 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:06Z","lastTransitionTime":"2026-03-20T10:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.916870 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.916918 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.916932 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.916953 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.916967 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:06Z","lastTransitionTime":"2026-03-20T10:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.021447 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.021513 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.021543 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.021565 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.021580 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:07Z","lastTransitionTime":"2026-03-20T10:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.127345 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.127407 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.127424 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.127441 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.127454 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:07Z","lastTransitionTime":"2026-03-20T10:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.231185 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.231277 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.231294 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.231313 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.231327 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:07Z","lastTransitionTime":"2026-03-20T10:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.334053 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.334111 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.334125 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.334147 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.334166 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:07Z","lastTransitionTime":"2026-03-20T10:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.413388 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:07 crc kubenswrapper[4860]: E0320 10:56:07.413769 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.428596 4860 scope.go:117] "RemoveContainer" containerID="b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa" Mar 20 10:56:07 crc kubenswrapper[4860]: E0320 10:56:07.428975 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.430647 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.436488 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.436548 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.436567 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.436592 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.436611 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:07Z","lastTransitionTime":"2026-03-20T10:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.439993 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"r
ecursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.461852 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.482079 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.503805 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.523956 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.538994 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.539252 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.539407 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.539212 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.539520 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.539571 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:07Z","lastTransitionTime":"2026-03-20T10:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.641981 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.642051 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.642061 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.642079 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.642089 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:07Z","lastTransitionTime":"2026-03-20T10:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.744656 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.744736 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.744756 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.744784 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.744801 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:07Z","lastTransitionTime":"2026-03-20T10:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.806498 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410"} Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.807157 4860 scope.go:117] "RemoveContainer" containerID="b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa" Mar 20 10:56:07 crc kubenswrapper[4860]: E0320 10:56:07.807372 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.824777 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T10:56:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.844605 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.847335 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.847366 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.847378 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.847393 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.847404 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:07Z","lastTransitionTime":"2026-03-20T10:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.859664 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.873757 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.886291 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.897192 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.908928 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.942218 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:56:07 crc kubenswrapper[4860]: E0320 10:56:07.942546 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 10:56:11.942474774 +0000 UTC m=+96.163835712 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.950016 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.950084 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.950109 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.950137 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.950156 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:07Z","lastTransitionTime":"2026-03-20T10:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.042812 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.042889 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.042915 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.042939 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:08 crc kubenswrapper[4860]: E0320 10:56:08.042998 4860 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" 
not registered Mar 20 10:56:08 crc kubenswrapper[4860]: E0320 10:56:08.043019 4860 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 10:56:08 crc kubenswrapper[4860]: E0320 10:56:08.043024 4860 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 10:56:08 crc kubenswrapper[4860]: E0320 10:56:08.043026 4860 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:56:08 crc kubenswrapper[4860]: E0320 10:56:08.043041 4860 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:56:08 crc kubenswrapper[4860]: E0320 10:56:08.043034 4860 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 10:56:08 crc kubenswrapper[4860]: E0320 10:56:08.043083 4860 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:56:08 crc kubenswrapper[4860]: E0320 10:56:08.043086 4860 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 10:56:08 crc 
kubenswrapper[4860]: E0320 10:56:08.043088 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:12.043073693 +0000 UTC m=+96.264434591 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:56:08 crc kubenswrapper[4860]: E0320 10:56:08.043125 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:12.043113474 +0000 UTC m=+96.264474372 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:56:08 crc kubenswrapper[4860]: E0320 10:56:08.043142 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:12.043131775 +0000 UTC m=+96.264492673 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:56:08 crc kubenswrapper[4860]: E0320 10:56:08.043154 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:12.043147865 +0000 UTC m=+96.264508763 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.052922 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.052954 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.052966 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.052982 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.052995 4860 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:08Z","lastTransitionTime":"2026-03-20T10:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.156110 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.156171 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.156183 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.156205 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.156218 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:08Z","lastTransitionTime":"2026-03-20T10:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.259766 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.259861 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.259879 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.259903 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.259919 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:08Z","lastTransitionTime":"2026-03-20T10:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.363793 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.363856 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.363872 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.363898 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.363910 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:08Z","lastTransitionTime":"2026-03-20T10:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.412842 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.412859 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:08 crc kubenswrapper[4860]: E0320 10:56:08.413098 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:08 crc kubenswrapper[4860]: E0320 10:56:08.413136 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.466514 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.466588 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.466605 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.466628 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.466651 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:08Z","lastTransitionTime":"2026-03-20T10:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.569551 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.569604 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.569622 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.569646 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.569664 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:08Z","lastTransitionTime":"2026-03-20T10:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.673285 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.673329 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.673340 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.673360 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.673371 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:08Z","lastTransitionTime":"2026-03-20T10:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.776055 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.776097 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.776108 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.776126 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.776136 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:08Z","lastTransitionTime":"2026-03-20T10:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.879654 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.879706 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.879717 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.879734 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.879756 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:08Z","lastTransitionTime":"2026-03-20T10:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.982736 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.982807 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.982830 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.982859 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.982880 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:08Z","lastTransitionTime":"2026-03-20T10:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.086423 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.086491 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.086509 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.086534 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.086553 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:09Z","lastTransitionTime":"2026-03-20T10:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.189862 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.189927 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.189954 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.189985 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.190007 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:09Z","lastTransitionTime":"2026-03-20T10:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.292451 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.292509 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.292524 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.292544 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.292562 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:09Z","lastTransitionTime":"2026-03-20T10:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.396944 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.397020 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.397044 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.397077 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.397101 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:09Z","lastTransitionTime":"2026-03-20T10:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.413046 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:09 crc kubenswrapper[4860]: E0320 10:56:09.413288 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.499393 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.499462 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.499480 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.499508 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.499527 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:09Z","lastTransitionTime":"2026-03-20T10:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.602972 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.603018 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.603030 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.603049 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.603064 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:09Z","lastTransitionTime":"2026-03-20T10:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.705856 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.705918 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.705929 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.705949 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.705968 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:09Z","lastTransitionTime":"2026-03-20T10:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.808694 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.808735 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.808746 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.808764 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.808777 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:09Z","lastTransitionTime":"2026-03-20T10:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.910948 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.910997 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.911005 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.911021 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.911030 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:09Z","lastTransitionTime":"2026-03-20T10:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.014009 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.014108 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.014128 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.014154 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.014174 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:10Z","lastTransitionTime":"2026-03-20T10:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.116644 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.116706 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.116719 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.116742 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.116757 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:10Z","lastTransitionTime":"2026-03-20T10:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.219928 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.220004 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.220019 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.220037 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.220049 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:10Z","lastTransitionTime":"2026-03-20T10:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.323032 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.323075 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.323083 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.323098 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.323111 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:10Z","lastTransitionTime":"2026-03-20T10:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.412648 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.412709 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:10 crc kubenswrapper[4860]: E0320 10:56:10.412848 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:10 crc kubenswrapper[4860]: E0320 10:56:10.413024 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.426048 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.426122 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.426141 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.426171 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.426190 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:10Z","lastTransitionTime":"2026-03-20T10:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.529329 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.529396 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.529409 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.529427 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.529439 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:10Z","lastTransitionTime":"2026-03-20T10:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.632156 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.632332 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.632363 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.632394 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.632415 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:10Z","lastTransitionTime":"2026-03-20T10:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.737329 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.737427 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.737455 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.737489 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.737525 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:10Z","lastTransitionTime":"2026-03-20T10:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.841286 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.841353 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.841370 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.841397 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.841416 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:10Z","lastTransitionTime":"2026-03-20T10:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.944346 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.944431 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.944447 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.944472 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.944502 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:10Z","lastTransitionTime":"2026-03-20T10:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.047350 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.047486 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.047515 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.047547 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.047570 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:11Z","lastTransitionTime":"2026-03-20T10:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.150863 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.150925 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.150961 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.150991 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.151016 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:11Z","lastTransitionTime":"2026-03-20T10:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.254291 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.254372 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.254395 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.254425 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.254461 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:11Z","lastTransitionTime":"2026-03-20T10:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.356933 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.356994 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.357009 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.357031 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.357045 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:11Z","lastTransitionTime":"2026-03-20T10:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.412857 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:11 crc kubenswrapper[4860]: E0320 10:56:11.413452 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.427327 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.460125 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.460171 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.460180 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.460196 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.460207 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:11Z","lastTransitionTime":"2026-03-20T10:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.563514 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.563557 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.563566 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.563583 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.563593 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:11Z","lastTransitionTime":"2026-03-20T10:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.668168 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.668219 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.668253 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.668273 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.668285 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:11Z","lastTransitionTime":"2026-03-20T10:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.771263 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.771367 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.771386 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.771447 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.771466 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:11Z","lastTransitionTime":"2026-03-20T10:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.875119 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.875204 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.875260 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.875288 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.875311 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:11Z","lastTransitionTime":"2026-03-20T10:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.978898 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.978968 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.978990 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.979017 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.979039 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:11Z","lastTransitionTime":"2026-03-20T10:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.980109 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:56:11 crc kubenswrapper[4860]: E0320 10:56:11.980358 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 10:56:19.980320291 +0000 UTC m=+104.201681229 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.080769 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.080834 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.080887 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.080941 4860 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:12 crc kubenswrapper[4860]: E0320 10:56:12.081002 4860 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 10:56:12 crc kubenswrapper[4860]: E0320 10:56:12.081124 4860 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 10:56:12 crc kubenswrapper[4860]: E0320 10:56:12.081166 4860 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:56:12 crc kubenswrapper[4860]: E0320 10:56:12.081208 4860 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 10:56:12 crc kubenswrapper[4860]: E0320 10:56:12.081379 4860 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 10:56:12 crc kubenswrapper[4860]: E0320 10:56:12.081406 4860 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:56:12 crc kubenswrapper[4860]: E0320 10:56:12.081191 4860 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:20.081173607 +0000 UTC m=+104.302534505 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 10:56:12 crc kubenswrapper[4860]: E0320 10:56:12.081465 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:20.081436644 +0000 UTC m=+104.302797552 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:56:12 crc kubenswrapper[4860]: E0320 10:56:12.081507 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:20.081476345 +0000 UTC m=+104.302837273 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:56:12 crc kubenswrapper[4860]: E0320 10:56:12.081603 4860 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 10:56:12 crc kubenswrapper[4860]: E0320 10:56:12.081623 4860 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:56:12 crc kubenswrapper[4860]: E0320 10:56:12.081849 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:20.08166937 +0000 UTC m=+104.303030348 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.082387 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.082480 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.082498 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.082522 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.082538 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:12Z","lastTransitionTime":"2026-03-20T10:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.185623 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.185705 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.185719 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.185742 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.185781 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:12Z","lastTransitionTime":"2026-03-20T10:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.288462 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.288544 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.288571 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.288604 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.288628 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:12Z","lastTransitionTime":"2026-03-20T10:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.391620 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.391662 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.391675 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.391692 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.391703 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:12Z","lastTransitionTime":"2026-03-20T10:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.413032 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.413576 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:12 crc kubenswrapper[4860]: E0320 10:56:12.413704 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:12 crc kubenswrapper[4860]: E0320 10:56:12.414055 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.494795 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.494874 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.494893 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.494926 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.494946 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:12Z","lastTransitionTime":"2026-03-20T10:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.598246 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.598315 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.598326 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.598360 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.598372 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:12Z","lastTransitionTime":"2026-03-20T10:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.701529 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.701600 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.701617 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.701642 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.701659 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:12Z","lastTransitionTime":"2026-03-20T10:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.805660 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.805748 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.805766 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.805795 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.805816 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:12Z","lastTransitionTime":"2026-03-20T10:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.909614 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.909688 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.909705 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.909736 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.909757 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:12Z","lastTransitionTime":"2026-03-20T10:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.012686 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.012773 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.012789 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.012809 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.012824 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:13Z","lastTransitionTime":"2026-03-20T10:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.115940 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.116006 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.116022 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.116047 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.116063 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:13Z","lastTransitionTime":"2026-03-20T10:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.220193 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.220296 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.220318 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.220344 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.220362 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:13Z","lastTransitionTime":"2026-03-20T10:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.323811 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.323887 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.323910 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.323942 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.323966 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:13Z","lastTransitionTime":"2026-03-20T10:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.413361 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:13 crc kubenswrapper[4860]: E0320 10:56:13.413547 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.426068 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.426133 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.426145 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.426180 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.426194 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:13Z","lastTransitionTime":"2026-03-20T10:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.529031 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.529261 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.529302 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.529338 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.529361 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:13Z","lastTransitionTime":"2026-03-20T10:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.632163 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.632326 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.632363 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.632414 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.632439 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:13Z","lastTransitionTime":"2026-03-20T10:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.734979 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.735024 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.735032 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.735051 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.735063 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:13Z","lastTransitionTime":"2026-03-20T10:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.837608 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.837663 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.837673 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.837687 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.837697 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:13Z","lastTransitionTime":"2026-03-20T10:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.940578 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.940630 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.940643 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.940668 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.940684 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:13Z","lastTransitionTime":"2026-03-20T10:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.042688 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.042739 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.042752 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.042773 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.042785 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:14Z","lastTransitionTime":"2026-03-20T10:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.147289 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.147340 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.147353 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.147372 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.147386 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:14Z","lastTransitionTime":"2026-03-20T10:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.250122 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.250175 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.250187 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.250204 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.250217 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:14Z","lastTransitionTime":"2026-03-20T10:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.353302 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.353373 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.353390 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.353416 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.353434 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:14Z","lastTransitionTime":"2026-03-20T10:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.413252 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.413371 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:14 crc kubenswrapper[4860]: E0320 10:56:14.413714 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:14 crc kubenswrapper[4860]: E0320 10:56:14.413951 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.457394 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.457811 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.457893 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.457972 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.458036 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:14Z","lastTransitionTime":"2026-03-20T10:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.561103 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.561152 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.561167 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.561190 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.561209 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:14Z","lastTransitionTime":"2026-03-20T10:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.663744 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.663793 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.663803 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.663823 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.663835 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:14Z","lastTransitionTime":"2026-03-20T10:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.767362 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.767403 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.767415 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.767432 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.767448 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:14Z","lastTransitionTime":"2026-03-20T10:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.869588 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.869620 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.869629 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.869643 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.869654 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:14Z","lastTransitionTime":"2026-03-20T10:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.972915 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.972986 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.973006 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.973034 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.973055 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:14Z","lastTransitionTime":"2026-03-20T10:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.076632 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.076706 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.076749 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.076787 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.076812 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:15Z","lastTransitionTime":"2026-03-20T10:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.179432 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.179468 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.179478 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.179503 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.179516 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:15Z","lastTransitionTime":"2026-03-20T10:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.283084 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.283134 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.283152 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.283179 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.283198 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:15Z","lastTransitionTime":"2026-03-20T10:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.387050 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.387120 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.387131 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.387150 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.387163 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:15Z","lastTransitionTime":"2026-03-20T10:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.412449 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:15 crc kubenswrapper[4860]: E0320 10:56:15.412654 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.490110 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.490191 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.490202 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.490240 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.490260 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:15Z","lastTransitionTime":"2026-03-20T10:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.592614 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.592690 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.592709 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.592738 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.592759 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:15Z","lastTransitionTime":"2026-03-20T10:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.695500 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.695550 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.695564 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.695586 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.695602 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:15Z","lastTransitionTime":"2026-03-20T10:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.798714 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.798780 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.798798 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.798824 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.798842 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:15Z","lastTransitionTime":"2026-03-20T10:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.901683 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.901742 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.901760 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.901789 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.901806 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:15Z","lastTransitionTime":"2026-03-20T10:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.004906 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.004937 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.004945 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.004960 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.004969 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:16Z","lastTransitionTime":"2026-03-20T10:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.108156 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.108282 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.108309 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.108343 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.108368 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:16Z","lastTransitionTime":"2026-03-20T10:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.212023 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.212133 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.212184 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.212218 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.212282 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:16Z","lastTransitionTime":"2026-03-20T10:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.312887 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.312955 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.312979 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.313009 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.313030 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:16Z","lastTransitionTime":"2026-03-20T10:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:16 crc kubenswrapper[4860]: E0320 10:56:16.338471 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:16Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.345945 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.346019 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.346039 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.346065 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.346086 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:16Z","lastTransitionTime":"2026-03-20T10:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:16 crc kubenswrapper[4860]: E0320 10:56:16.381379 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:16Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.389386 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.389459 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.389483 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.389516 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.389541 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:16Z","lastTransitionTime":"2026-03-20T10:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.412524 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.412522 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:16 crc kubenswrapper[4860]: E0320 10:56:16.412731 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:16 crc kubenswrapper[4860]: E0320 10:56:16.412870 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:16 crc kubenswrapper[4860]: E0320 10:56:16.424483 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:16Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.429431 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.429503 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.429517 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.429540 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.429566 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:16Z","lastTransitionTime":"2026-03-20T10:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:16 crc kubenswrapper[4860]: E0320 10:56:16.445260 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:16Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.449146 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.449201 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.449215 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.449249 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.449263 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:16Z","lastTransitionTime":"2026-03-20T10:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:16 crc kubenswrapper[4860]: E0320 10:56:16.464807 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:16Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:16 crc kubenswrapper[4860]: E0320 10:56:16.465260 4860 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.468597 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.468654 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.468667 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.468687 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.468699 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:16Z","lastTransitionTime":"2026-03-20T10:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.571753 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.571823 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.571840 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.571864 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.571879 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:16Z","lastTransitionTime":"2026-03-20T10:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.674851 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.674906 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.674923 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.674945 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.674962 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:16Z","lastTransitionTime":"2026-03-20T10:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.778373 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.778457 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.778491 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.778522 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.778540 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:16Z","lastTransitionTime":"2026-03-20T10:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.882074 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.882123 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.882139 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.882158 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.882171 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:16Z","lastTransitionTime":"2026-03-20T10:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.985495 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.985578 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.985592 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.985617 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.985633 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:16Z","lastTransitionTime":"2026-03-20T10:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.089038 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.089094 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.089107 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.089127 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.089142 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:17Z","lastTransitionTime":"2026-03-20T10:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.191693 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.191805 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.191860 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.191891 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.191911 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:17Z","lastTransitionTime":"2026-03-20T10:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.295393 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.295477 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.295500 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.295529 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.295549 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:17Z","lastTransitionTime":"2026-03-20T10:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.399181 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.399304 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.399384 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.399416 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.399438 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:17Z","lastTransitionTime":"2026-03-20T10:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.412916 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:17 crc kubenswrapper[4860]: E0320 10:56:17.413279 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.437296 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"r
ecursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:17Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.459079 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":
\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:17Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.482666 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:17Z 
is after 2025-08-24T17:21:41Z" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.502663 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.502732 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.502742 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.502759 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.502770 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:17Z","lastTransitionTime":"2026-03-20T10:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.508585 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:17Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.533793 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:17Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.554052 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:17Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.574149 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:17Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.590218 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:17Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.605552 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.605617 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.605634 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:17 crc 
kubenswrapper[4860]: I0320 10:56:17.605656 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.605670 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:17Z","lastTransitionTime":"2026-03-20T10:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.709479 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.709536 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.709547 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.709567 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.709589 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:17Z","lastTransitionTime":"2026-03-20T10:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.813340 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.813427 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.813445 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.813471 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.813489 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:17Z","lastTransitionTime":"2026-03-20T10:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.917768 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.917837 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.917853 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.917888 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.917905 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:17Z","lastTransitionTime":"2026-03-20T10:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.020775 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.020822 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.020840 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.020865 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.020882 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:18Z","lastTransitionTime":"2026-03-20T10:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.124925 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.124983 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.124996 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.125017 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.125031 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:18Z","lastTransitionTime":"2026-03-20T10:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.228799 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.228859 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.228872 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.228893 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.228907 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:18Z","lastTransitionTime":"2026-03-20T10:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.332324 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.332397 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.332410 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.332432 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.332451 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:18Z","lastTransitionTime":"2026-03-20T10:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.412656 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.412656 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:18 crc kubenswrapper[4860]: E0320 10:56:18.412805 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:18 crc kubenswrapper[4860]: E0320 10:56:18.413084 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.435837 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.435889 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.435907 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.435936 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.435949 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:18Z","lastTransitionTime":"2026-03-20T10:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.538507 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.538662 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.538690 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.538713 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.538730 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:18Z","lastTransitionTime":"2026-03-20T10:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.642009 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.642064 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.642083 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.642108 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.642126 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:18Z","lastTransitionTime":"2026-03-20T10:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.746173 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.746271 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.746295 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.746328 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.746349 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:18Z","lastTransitionTime":"2026-03-20T10:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.848945 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.848978 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.848987 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.849001 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.849009 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:18Z","lastTransitionTime":"2026-03-20T10:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.951360 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.951468 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.951493 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.951525 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.951552 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:18Z","lastTransitionTime":"2026-03-20T10:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.054539 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.054582 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.054593 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.054612 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.054624 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:19Z","lastTransitionTime":"2026-03-20T10:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.157455 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.157505 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.157516 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.157536 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.157548 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:19Z","lastTransitionTime":"2026-03-20T10:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.260195 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.260948 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.260966 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.260984 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.260996 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:19Z","lastTransitionTime":"2026-03-20T10:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.364159 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.364209 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.364219 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.364248 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.364259 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:19Z","lastTransitionTime":"2026-03-20T10:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.413608 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:19 crc kubenswrapper[4860]: E0320 10:56:19.413882 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.414083 4860 scope.go:117] "RemoveContainer" containerID="b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa" Mar 20 10:56:19 crc kubenswrapper[4860]: E0320 10:56:19.414281 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.466715 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.466776 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.466787 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.466807 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.466819 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:19Z","lastTransitionTime":"2026-03-20T10:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.569733 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.569774 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.569785 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.569801 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.569813 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:19Z","lastTransitionTime":"2026-03-20T10:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.673054 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.673155 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.673181 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.673211 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.673277 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:19Z","lastTransitionTime":"2026-03-20T10:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.775791 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.775844 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.775858 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.775877 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.775892 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:19Z","lastTransitionTime":"2026-03-20T10:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.878189 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.878281 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.878294 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.878312 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.878323 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:19Z","lastTransitionTime":"2026-03-20T10:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.980940 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.980989 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.981000 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.981018 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.981029 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:19Z","lastTransitionTime":"2026-03-20T10:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.058709 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:56:20 crc kubenswrapper[4860]: E0320 10:56:20.058873 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 10:56:36.058854582 +0000 UTC m=+120.280215480 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.084649 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.084700 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.084711 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.084730 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.084741 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:20Z","lastTransitionTime":"2026-03-20T10:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.160219 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.160299 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.160334 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.160366 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:20 crc kubenswrapper[4860]: E0320 10:56:20.160423 4860 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:56:20 crc kubenswrapper[4860]: E0320 10:56:20.160459 4860 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 10:56:20 crc kubenswrapper[4860]: E0320 10:56:20.160468 4860 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 10:56:20 crc kubenswrapper[4860]: E0320 10:56:20.160485 4860 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 10:56:20 crc kubenswrapper[4860]: E0320 10:56:20.160492 4860 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 10:56:20 crc kubenswrapper[4860]: E0320 10:56:20.160500 4860 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:56:20 crc kubenswrapper[4860]: E0320 10:56:20.160503 4860 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:56:20 crc kubenswrapper[4860]: E0320 10:56:20.160502 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf 
podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:36.160480379 +0000 UTC m=+120.381841277 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:56:20 crc kubenswrapper[4860]: E0320 10:56:20.160466 4860 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 10:56:20 crc kubenswrapper[4860]: E0320 10:56:20.160548 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:36.16053703 +0000 UTC m=+120.381897938 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:56:20 crc kubenswrapper[4860]: E0320 10:56:20.160570 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:36.160558671 +0000 UTC m=+120.381919569 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 10:56:20 crc kubenswrapper[4860]: E0320 10:56:20.160588 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:36.160580021 +0000 UTC m=+120.381940919 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.187515 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.187552 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.187561 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.187575 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.187584 4860 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:20Z","lastTransitionTime":"2026-03-20T10:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.290071 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.290108 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.290120 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.290134 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.290144 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:20Z","lastTransitionTime":"2026-03-20T10:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.393434 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.393504 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.393520 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.393550 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.393564 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:20Z","lastTransitionTime":"2026-03-20T10:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.413273 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:20 crc kubenswrapper[4860]: E0320 10:56:20.413566 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.413289 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:20 crc kubenswrapper[4860]: E0320 10:56:20.413981 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.496198 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.496488 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.496579 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.496713 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.496794 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:20Z","lastTransitionTime":"2026-03-20T10:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.599439 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.599514 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.599531 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.599556 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.599575 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:20Z","lastTransitionTime":"2026-03-20T10:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.702458 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.702521 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.702531 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.702548 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.702558 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:20Z","lastTransitionTime":"2026-03-20T10:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.805528 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.805817 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.805889 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.805995 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.806142 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:20Z","lastTransitionTime":"2026-03-20T10:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.909433 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.909492 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.909507 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.909531 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.909549 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:20Z","lastTransitionTime":"2026-03-20T10:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.012709 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.012769 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.012779 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.012796 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.012807 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:21Z","lastTransitionTime":"2026-03-20T10:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.115451 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.115541 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.115557 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.115585 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.115602 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:21Z","lastTransitionTime":"2026-03-20T10:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.219183 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.219260 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.219272 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.219294 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.219309 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:21Z","lastTransitionTime":"2026-03-20T10:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.322339 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.322387 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.322398 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.322417 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.322430 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:21Z","lastTransitionTime":"2026-03-20T10:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.413316 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:21 crc kubenswrapper[4860]: E0320 10:56:21.413520 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.425036 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.425079 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.425089 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.425105 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.425116 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:21Z","lastTransitionTime":"2026-03-20T10:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.527644 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.527702 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.527713 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.527730 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.527744 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:21Z","lastTransitionTime":"2026-03-20T10:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.630029 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.630086 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.630095 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.630112 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.630122 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:21Z","lastTransitionTime":"2026-03-20T10:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.642084 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-srbpg"] Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.642483 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-srbpg" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.644806 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.645175 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.646870 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.657670 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":
{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:21Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.671084 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:21Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.684217 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:21Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.696832 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:21Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.708413 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:21Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.721135 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:21Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.732672 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 
10:56:21.732761 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.732780 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.732808 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.732840 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:21Z","lastTransitionTime":"2026-03-20T10:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.733772 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:21Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.758011 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:21Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.774861 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:21Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.778285 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ml6nd\" (UniqueName: \"kubernetes.io/projected/93e597b5-a377-4988-8c59-eeace5ffa4e4-kube-api-access-ml6nd\") pod \"node-resolver-srbpg\" (UID: \"93e597b5-a377-4988-8c59-eeace5ffa4e4\") " pod="openshift-dns/node-resolver-srbpg" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.778369 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/93e597b5-a377-4988-8c59-eeace5ffa4e4-hosts-file\") pod \"node-resolver-srbpg\" (UID: \"93e597b5-a377-4988-8c59-eeace5ffa4e4\") " pod="openshift-dns/node-resolver-srbpg" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.835821 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.835903 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.835924 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.835946 4860 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.835966 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:21Z","lastTransitionTime":"2026-03-20T10:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.879501 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ml6nd\" (UniqueName: \"kubernetes.io/projected/93e597b5-a377-4988-8c59-eeace5ffa4e4-kube-api-access-ml6nd\") pod \"node-resolver-srbpg\" (UID: \"93e597b5-a377-4988-8c59-eeace5ffa4e4\") " pod="openshift-dns/node-resolver-srbpg" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.879611 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/93e597b5-a377-4988-8c59-eeace5ffa4e4-hosts-file\") pod \"node-resolver-srbpg\" (UID: \"93e597b5-a377-4988-8c59-eeace5ffa4e4\") " pod="openshift-dns/node-resolver-srbpg" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.879752 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/93e597b5-a377-4988-8c59-eeace5ffa4e4-hosts-file\") pod \"node-resolver-srbpg\" (UID: \"93e597b5-a377-4988-8c59-eeace5ffa4e4\") " pod="openshift-dns/node-resolver-srbpg" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.900498 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ml6nd\" (UniqueName: \"kubernetes.io/projected/93e597b5-a377-4988-8c59-eeace5ffa4e4-kube-api-access-ml6nd\") pod \"node-resolver-srbpg\" (UID: 
\"93e597b5-a377-4988-8c59-eeace5ffa4e4\") " pod="openshift-dns/node-resolver-srbpg" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.940070 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.940129 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.940143 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.940163 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.940179 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:21Z","lastTransitionTime":"2026-03-20T10:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.955429 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-srbpg" Mar 20 10:56:21 crc kubenswrapper[4860]: W0320 10:56:21.971837 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93e597b5_a377_4988_8c59_eeace5ffa4e4.slice/crio-09389447edc5810169163731be1f52e1fae6e57d63d9deda32883a0652640d03 WatchSource:0}: Error finding container 09389447edc5810169163731be1f52e1fae6e57d63d9deda32883a0652640d03: Status 404 returned error can't find the container with id 09389447edc5810169163731be1f52e1fae6e57d63d9deda32883a0652640d03 Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.024672 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-kvdqp"] Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.025125 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.027819 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.028191 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.028444 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-cmc44"] Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.028460 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.028603 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.028621 4860 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.028800 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-wpj5w"] Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.029103 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.029371 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.030818 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.031047 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.031284 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.031332 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.031796 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.031983 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.032329 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.043488 4860 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.043518 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.043529 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.043542 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.043553 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:22Z","lastTransitionTime":"2026-03-20T10:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.043666 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.058155 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.070829 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.091617 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.106818 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.121907 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.137491 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.145906 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.145948 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.145958 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:22 crc 
kubenswrapper[4860]: I0320 10:56:22.145973 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.145986 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:22Z","lastTransitionTime":"2026-03-20T10:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.150212 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.164839 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.179727 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.182156 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/329ab518-a391-4483-8373-1329318b58da-system-cni-dir\") pod \"multus-additional-cni-plugins-wpj5w\" (UID: \"329ab518-a391-4483-8373-1329318b58da\") " pod="openshift-multus/multus-additional-cni-plugins-wpj5w" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.182219 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/329ab518-a391-4483-8373-1329318b58da-os-release\") pod \"multus-additional-cni-plugins-wpj5w\" (UID: \"329ab518-a391-4483-8373-1329318b58da\") " pod="openshift-multus/multus-additional-cni-plugins-wpj5w" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.182328 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/329ab518-a391-4483-8373-1329318b58da-cni-binary-copy\") pod \"multus-additional-cni-plugins-wpj5w\" (UID: 
\"329ab518-a391-4483-8373-1329318b58da\") " pod="openshift-multus/multus-additional-cni-plugins-wpj5w" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.182361 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6a9df230-75a1-4b64-8d00-c179e9c19080-mcd-auth-proxy-config\") pod \"machine-config-daemon-kvdqp\" (UID: \"6a9df230-75a1-4b64-8d00-c179e9c19080\") " pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.182386 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhhqf\" (UniqueName: \"kubernetes.io/projected/6a9df230-75a1-4b64-8d00-c179e9c19080-kube-api-access-vhhqf\") pod \"machine-config-daemon-kvdqp\" (UID: \"6a9df230-75a1-4b64-8d00-c179e9c19080\") " pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.182411 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/329ab518-a391-4483-8373-1329318b58da-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wpj5w\" (UID: \"329ab518-a391-4483-8373-1329318b58da\") " pod="openshift-multus/multus-additional-cni-plugins-wpj5w" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.182446 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6a9df230-75a1-4b64-8d00-c179e9c19080-proxy-tls\") pod \"machine-config-daemon-kvdqp\" (UID: \"6a9df230-75a1-4b64-8d00-c179e9c19080\") " pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.182477 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/6a9df230-75a1-4b64-8d00-c179e9c19080-rootfs\") pod \"machine-config-daemon-kvdqp\" (UID: \"6a9df230-75a1-4b64-8d00-c179e9c19080\") " pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.182511 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/329ab518-a391-4483-8373-1329318b58da-cnibin\") pod \"multus-additional-cni-plugins-wpj5w\" (UID: \"329ab518-a391-4483-8373-1329318b58da\") " pod="openshift-multus/multus-additional-cni-plugins-wpj5w" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.182543 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-system-cni-dir\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.182570 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-host-run-multus-certs\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.182633 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-host-run-k8s-cni-cncf-io\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.182664 4860 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-cnibin\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.182723 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w22pp\" (UniqueName: \"kubernetes.io/projected/a89c8af2-338f-401f-aad5-c6d7763a3b3a-kube-api-access-w22pp\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.182759 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a89c8af2-338f-401f-aad5-c6d7763a3b3a-cni-binary-copy\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.182785 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-hostroot\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.182817 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a89c8af2-338f-401f-aad5-c6d7763a3b3a-multus-daemon-config\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.182938 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-host-run-netns\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.183052 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-multus-conf-dir\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.183114 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/329ab518-a391-4483-8373-1329318b58da-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wpj5w\" (UID: \"329ab518-a391-4483-8373-1329318b58da\") " pod="openshift-multus/multus-additional-cni-plugins-wpj5w" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.183153 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-multus-cni-dir\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.183184 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-multus-socket-dir-parent\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.183211 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-host-var-lib-cni-multus\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.183281 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-host-var-lib-kubelet\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.183310 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-os-release\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.183337 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-242fz\" (UniqueName: \"kubernetes.io/projected/329ab518-a391-4483-8373-1329318b58da-kube-api-access-242fz\") pod \"multus-additional-cni-plugins-wpj5w\" (UID: \"329ab518-a391-4483-8373-1329318b58da\") " pod="openshift-multus/multus-additional-cni-plugins-wpj5w" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.183370 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-etc-kubernetes\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.183420 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-host-var-lib-cni-bin\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.197145 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.214905 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.232694 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.248699 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.248739 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.248758 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.248776 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.248788 4860 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:22Z","lastTransitionTime":"2026-03-20T10:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.249992 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.273161 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.284346 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-host-var-lib-cni-multus\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.284397 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-host-var-lib-kubelet\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.284418 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-os-release\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.284445 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-242fz\" (UniqueName: 
\"kubernetes.io/projected/329ab518-a391-4483-8373-1329318b58da-kube-api-access-242fz\") pod \"multus-additional-cni-plugins-wpj5w\" (UID: \"329ab518-a391-4483-8373-1329318b58da\") " pod="openshift-multus/multus-additional-cni-plugins-wpj5w" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.284469 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-etc-kubernetes\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.284494 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-host-var-lib-cni-bin\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.284514 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/329ab518-a391-4483-8373-1329318b58da-system-cni-dir\") pod \"multus-additional-cni-plugins-wpj5w\" (UID: \"329ab518-a391-4483-8373-1329318b58da\") " pod="openshift-multus/multus-additional-cni-plugins-wpj5w" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.284536 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/329ab518-a391-4483-8373-1329318b58da-os-release\") pod \"multus-additional-cni-plugins-wpj5w\" (UID: \"329ab518-a391-4483-8373-1329318b58da\") " pod="openshift-multus/multus-additional-cni-plugins-wpj5w" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.284533 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-host-var-lib-cni-multus\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.284555 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/329ab518-a391-4483-8373-1329318b58da-cni-binary-copy\") pod \"multus-additional-cni-plugins-wpj5w\" (UID: \"329ab518-a391-4483-8373-1329318b58da\") " pod="openshift-multus/multus-additional-cni-plugins-wpj5w" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.284658 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-host-var-lib-kubelet\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.284737 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-etc-kubernetes\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.284773 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-os-release\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.284748 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/329ab518-a391-4483-8373-1329318b58da-system-cni-dir\") pod \"multus-additional-cni-plugins-wpj5w\" (UID: 
\"329ab518-a391-4483-8373-1329318b58da\") " pod="openshift-multus/multus-additional-cni-plugins-wpj5w" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.284780 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/329ab518-a391-4483-8373-1329318b58da-os-release\") pod \"multus-additional-cni-plugins-wpj5w\" (UID: \"329ab518-a391-4483-8373-1329318b58da\") " pod="openshift-multus/multus-additional-cni-plugins-wpj5w" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.284806 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6a9df230-75a1-4b64-8d00-c179e9c19080-proxy-tls\") pod \"machine-config-daemon-kvdqp\" (UID: \"6a9df230-75a1-4b64-8d00-c179e9c19080\") " pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.284843 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-host-var-lib-cni-bin\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.284871 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6a9df230-75a1-4b64-8d00-c179e9c19080-mcd-auth-proxy-config\") pod \"machine-config-daemon-kvdqp\" (UID: \"6a9df230-75a1-4b64-8d00-c179e9c19080\") " pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.284906 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhhqf\" (UniqueName: \"kubernetes.io/projected/6a9df230-75a1-4b64-8d00-c179e9c19080-kube-api-access-vhhqf\") pod \"machine-config-daemon-kvdqp\" 
(UID: \"6a9df230-75a1-4b64-8d00-c179e9c19080\") " pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.284923 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/329ab518-a391-4483-8373-1329318b58da-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wpj5w\" (UID: \"329ab518-a391-4483-8373-1329318b58da\") " pod="openshift-multus/multus-additional-cni-plugins-wpj5w" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.284979 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/6a9df230-75a1-4b64-8d00-c179e9c19080-rootfs\") pod \"machine-config-daemon-kvdqp\" (UID: \"6a9df230-75a1-4b64-8d00-c179e9c19080\") " pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.284999 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/329ab518-a391-4483-8373-1329318b58da-cnibin\") pod \"multus-additional-cni-plugins-wpj5w\" (UID: \"329ab518-a391-4483-8373-1329318b58da\") " pod="openshift-multus/multus-additional-cni-plugins-wpj5w" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.285020 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-system-cni-dir\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.285037 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-host-run-multus-certs\") pod \"multus-cmc44\" (UID: 
\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.285055 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-host-run-k8s-cni-cncf-io\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.285074 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-cnibin\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.285090 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w22pp\" (UniqueName: \"kubernetes.io/projected/a89c8af2-338f-401f-aad5-c6d7763a3b3a-kube-api-access-w22pp\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.285088 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/329ab518-a391-4483-8373-1329318b58da-cnibin\") pod \"multus-additional-cni-plugins-wpj5w\" (UID: \"329ab518-a391-4483-8373-1329318b58da\") " pod="openshift-multus/multus-additional-cni-plugins-wpj5w" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.285114 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a89c8af2-338f-401f-aad5-c6d7763a3b3a-cni-binary-copy\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 
10:56:22.285125 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-host-run-multus-certs\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.285139 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/6a9df230-75a1-4b64-8d00-c179e9c19080-rootfs\") pod \"machine-config-daemon-kvdqp\" (UID: \"6a9df230-75a1-4b64-8d00-c179e9c19080\") " pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.285156 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-hostroot\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.285131 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-hostroot\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.285185 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a89c8af2-338f-401f-aad5-c6d7763a3b3a-multus-daemon-config\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.285209 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/329ab518-a391-4483-8373-1329318b58da-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wpj5w\" (UID: \"329ab518-a391-4483-8373-1329318b58da\") " pod="openshift-multus/multus-additional-cni-plugins-wpj5w" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.285240 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-cnibin\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.285285 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-host-run-netns\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.285259 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-host-run-netns\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.285353 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-multus-conf-dir\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.285390 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-multus-cni-dir\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " 
pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.285273 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-host-run-k8s-cni-cncf-io\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.285420 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-multus-socket-dir-parent\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.285446 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-multus-conf-dir\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.285514 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-multus-socket-dir-parent\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.285581 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-system-cni-dir\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.285608 4860 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-multus-cni-dir\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.285969 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a89c8af2-338f-401f-aad5-c6d7763a3b3a-multus-daemon-config\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.285968 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a89c8af2-338f-401f-aad5-c6d7763a3b3a-cni-binary-copy\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.286045 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/329ab518-a391-4483-8373-1329318b58da-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wpj5w\" (UID: \"329ab518-a391-4483-8373-1329318b58da\") " pod="openshift-multus/multus-additional-cni-plugins-wpj5w" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.286160 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/329ab518-a391-4483-8373-1329318b58da-cni-binary-copy\") pod \"multus-additional-cni-plugins-wpj5w\" (UID: \"329ab518-a391-4483-8373-1329318b58da\") " pod="openshift-multus/multus-additional-cni-plugins-wpj5w" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.286591 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/329ab518-a391-4483-8373-1329318b58da-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wpj5w\" (UID: \"329ab518-a391-4483-8373-1329318b58da\") " pod="openshift-multus/multus-additional-cni-plugins-wpj5w" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.289429 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6a9df230-75a1-4b64-8d00-c179e9c19080-mcd-auth-proxy-config\") pod \"machine-config-daemon-kvdqp\" (UID: \"6a9df230-75a1-4b64-8d00-c179e9c19080\") " pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.289541 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6a9df230-75a1-4b64-8d00-c179e9c19080-proxy-tls\") pod \"machine-config-daemon-kvdqp\" (UID: \"6a9df230-75a1-4b64-8d00-c179e9c19080\") " pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.289888 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.303142 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.303602 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-242fz\" (UniqueName: \"kubernetes.io/projected/329ab518-a391-4483-8373-1329318b58da-kube-api-access-242fz\") pod \"multus-additional-cni-plugins-wpj5w\" (UID: \"329ab518-a391-4483-8373-1329318b58da\") " pod="openshift-multus/multus-additional-cni-plugins-wpj5w" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.305750 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhhqf\" (UniqueName: 
\"kubernetes.io/projected/6a9df230-75a1-4b64-8d00-c179e9c19080-kube-api-access-vhhqf\") pod \"machine-config-daemon-kvdqp\" (UID: \"6a9df230-75a1-4b64-8d00-c179e9c19080\") " pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.305922 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w22pp\" (UniqueName: \"kubernetes.io/projected/a89c8af2-338f-401f-aad5-c6d7763a3b3a-kube-api-access-w22pp\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.319099 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.334441 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.343412 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.349874 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.352005 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.352050 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.352065 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.352087 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.352101 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:22Z","lastTransitionTime":"2026-03-20T10:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.353857 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: W0320 10:56:22.355028 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a9df230_75a1_4b64_8d00_c179e9c19080.slice/crio-196bd2b259b76529f9465763d2b9768b290464157dfac8f795e04bd6cd98a4a3 WatchSource:0}: Error finding container 196bd2b259b76529f9465763d2b9768b290464157dfac8f795e04bd6cd98a4a3: Status 404 returned error can't find the container with id 196bd2b259b76529f9465763d2b9768b290464157dfac8f795e04bd6cd98a4a3 Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.355924 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" Mar 20 10:56:22 crc kubenswrapper[4860]: W0320 10:56:22.366572 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda89c8af2_338f_401f_aad5_c6d7763a3b3a.slice/crio-87718fcea05d1a2abf6fd28ad257ff7c1a08c5fa78b0fee32a770f100ba3e177 WatchSource:0}: Error finding container 87718fcea05d1a2abf6fd28ad257ff7c1a08c5fa78b0fee32a770f100ba3e177: Status 404 returned error can't find the container with id 87718fcea05d1a2abf6fd28ad257ff7c1a08c5fa78b0fee32a770f100ba3e177 Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.367811 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: W0320 10:56:22.377285 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod329ab518_a391_4483_8373_1329318b58da.slice/crio-86a469bdc9a3d9e601f368890023052fd83e84c1a16211f91ad47e47cda099e9 WatchSource:0}: Error finding container 86a469bdc9a3d9e601f368890023052fd83e84c1a16211f91ad47e47cda099e9: Status 404 returned error can't find the container with id 86a469bdc9a3d9e601f368890023052fd83e84c1a16211f91ad47e47cda099e9 Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.383790 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.406111 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-nbkmw"] Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.407096 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.409382 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.411370 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.411438 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.411567 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.411378 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.412014 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.412154 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.412357 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:22 crc kubenswrapper[4860]: E0320 10:56:22.412446 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.414109 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:22 crc kubenswrapper[4860]: E0320 10:56:22.414286 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.428034 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nbkmw\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.445578 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.454888 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.454950 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.454966 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.454992 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.455006 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:22Z","lastTransitionTime":"2026-03-20T10:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.460004 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.475329 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.490116 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.505445 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.521008 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.537165 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.558314 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.558359 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.558370 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.558387 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.558398 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:22Z","lastTransitionTime":"2026-03-20T10:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.560118 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountP
ath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f
42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90
092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finis
hedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.576019 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.589165 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-var-lib-openvswitch\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.589155 4860 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-dns/node-resolver-srbpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.589210 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-etc-openvswitch\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.589357 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" 
(UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-kubelet\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.589377 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-node-log\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.589404 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.589424 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-ovn-node-metrics-cert\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.589503 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-run-ovn\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.589536 4860 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-ovnkube-script-lib\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.589558 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9btp\" (UniqueName: \"kubernetes.io/projected/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-kube-api-access-l9btp\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.589599 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-ovnkube-config\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.589614 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-env-overrides\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.589631 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-slash\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.589665 
4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-run-ovn-kubernetes\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.589860 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-run-openvswitch\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.589887 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-cni-bin\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.589916 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-run-netns\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.589938 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-cni-netd\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc 
kubenswrapper[4860]: I0320 10:56:22.589967 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-run-systemd\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.589989 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-log-socket\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.590017 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-systemd-units\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.603709 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.621027 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.660933 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.660992 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.661002 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.661021 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.661031 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:22Z","lastTransitionTime":"2026-03-20T10:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.691392 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-etc-openvswitch\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.691435 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-kubelet\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.691461 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-node-log\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.691482 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.691498 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-etc-openvswitch\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 
10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.691508 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-ovn-node-metrics-cert\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.691529 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-run-ovn\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.691548 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-ovnkube-script-lib\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.691560 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-node-log\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.691571 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9btp\" (UniqueName: \"kubernetes.io/projected/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-kube-api-access-l9btp\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.691623 4860 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-ovnkube-config\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.691647 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-env-overrides\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.691672 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-slash\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.691660 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-kubelet\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.691724 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-run-ovn-kubernetes\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.691722 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.691692 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-run-ovn-kubernetes\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.691854 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-slash\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.691880 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-run-ovn\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.691934 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-run-openvswitch\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.692052 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-run-openvswitch\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.692085 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-cni-bin\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.692200 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-cni-bin\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.692296 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-run-netns\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.692884 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-cni-netd\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.692915 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-run-systemd\") pod 
\"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.692944 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-cni-netd\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.692937 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-log-socket\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.692759 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-ovnkube-script-lib\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.692666 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-env-overrides\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.692994 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-systemd-units\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.693012 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-log-socket\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.692778 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-ovnkube-config\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.692987 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-run-systemd\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.692318 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-run-netns\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.693043 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-systemd-units\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.693090 4860 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-var-lib-openvswitch\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.693126 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-var-lib-openvswitch\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.696717 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-ovn-node-metrics-cert\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.710513 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9btp\" (UniqueName: \"kubernetes.io/projected/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-kube-api-access-l9btp\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.740111 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.763332 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.763373 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.763382 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.763400 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.763412 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:22Z","lastTransitionTime":"2026-03-20T10:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:22 crc kubenswrapper[4860]: W0320 10:56:22.772825 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb85f6f9_1c0f_4388_9464_25dfe48d8d0f.slice/crio-f02c2b7cd3b6fd372807cb3bef1e94c63376211be0ef5968a6850857b9722fe7 WatchSource:0}: Error finding container f02c2b7cd3b6fd372807cb3bef1e94c63376211be0ef5968a6850857b9722fe7: Status 404 returned error can't find the container with id f02c2b7cd3b6fd372807cb3bef1e94c63376211be0ef5968a6850857b9722fe7 Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.854612 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" event={"ID":"329ab518-a391-4483-8373-1329318b58da","Type":"ContainerStarted","Data":"ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d"} Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.854722 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" event={"ID":"329ab518-a391-4483-8373-1329318b58da","Type":"ContainerStarted","Data":"86a469bdc9a3d9e601f368890023052fd83e84c1a16211f91ad47e47cda099e9"} Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.856540 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cmc44" event={"ID":"a89c8af2-338f-401f-aad5-c6d7763a3b3a","Type":"ContainerStarted","Data":"b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311"} Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.856574 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cmc44" event={"ID":"a89c8af2-338f-401f-aad5-c6d7763a3b3a","Type":"ContainerStarted","Data":"87718fcea05d1a2abf6fd28ad257ff7c1a08c5fa78b0fee32a770f100ba3e177"} Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.858272 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" event={"ID":"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f","Type":"ContainerStarted","Data":"f02c2b7cd3b6fd372807cb3bef1e94c63376211be0ef5968a6850857b9722fe7"} Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.860089 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" event={"ID":"6a9df230-75a1-4b64-8d00-c179e9c19080","Type":"ContainerStarted","Data":"c9ba3ecd4dbf40f800c11196492112694eb2fe160969cbd5f0b6fa7335552777"} Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.860181 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" event={"ID":"6a9df230-75a1-4b64-8d00-c179e9c19080","Type":"ContainerStarted","Data":"2176d4712fa4ee31d4322bb4cf9433058b2da0d6000e5bfb9e7d230fdffb3eda"} Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.860198 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" event={"ID":"6a9df230-75a1-4b64-8d00-c179e9c19080","Type":"ContainerStarted","Data":"196bd2b259b76529f9465763d2b9768b290464157dfac8f795e04bd6cd98a4a3"} Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.863860 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-srbpg" event={"ID":"93e597b5-a377-4988-8c59-eeace5ffa4e4","Type":"ContainerStarted","Data":"299876d90de1dc71dcbd805a769b991971944194b57cda3dff90392f35318dae"} Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.863977 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-srbpg" event={"ID":"93e597b5-a377-4988-8c59-eeace5ffa4e4","Type":"ContainerStarted","Data":"09389447edc5810169163731be1f52e1fae6e57d63d9deda32883a0652640d03"} Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.865557 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.865602 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.865612 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.865631 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.865644 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:22Z","lastTransitionTime":"2026-03-20T10:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.872646 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.888829 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.901031 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.913995 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.932396 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nbkmw\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.957571 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\
":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad9
9b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.968307 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.968361 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.968373 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.968391 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.968403 4860 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:22Z","lastTransitionTime":"2026-03-20T10:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.972306 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.984417 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.995621 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.010384 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.027178 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\
"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d
742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.043957 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.060998 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.070802 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.070877 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.070893 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.070917 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.070932 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:23Z","lastTransitionTime":"2026-03-20T10:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.077059 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.094193 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://299876d90de1dc71dcbd805a769b991971944194b57cda3dff90392f35318dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.111881 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ba3ecd4dbf40f800c11196492112694eb2fe160969cbd5f0b6fa7335552777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2176d4712fa4ee31d4322bb4cf9433058b2da0d6000e5bfb9e7d230fdffb3eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.126715 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.144106 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.169437 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.173458 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.173497 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.173505 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.173520 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.173530 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:23Z","lastTransitionTime":"2026-03-20T10:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.189881 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.211415 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\
\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.232197 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:23 crc 
kubenswrapper[4860]: I0320 10:56:23.247022 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.268767 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nbkmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.275442 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.275476 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.275485 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.275499 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.275509 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:23Z","lastTransitionTime":"2026-03-20T10:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.287076 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.301095 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.378335 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.378394 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.378408 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.378431 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.378446 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:23Z","lastTransitionTime":"2026-03-20T10:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.413500 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:23 crc kubenswrapper[4860]: E0320 10:56:23.413635 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.481571 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.481616 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.481626 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.481643 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.481654 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:23Z","lastTransitionTime":"2026-03-20T10:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.584988 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.585048 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.585061 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.585087 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.585106 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:23Z","lastTransitionTime":"2026-03-20T10:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.688621 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.688673 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.688685 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.688703 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.688713 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:23Z","lastTransitionTime":"2026-03-20T10:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.791633 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.791681 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.791694 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.791713 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.791727 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:23Z","lastTransitionTime":"2026-03-20T10:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.871362 4860 generic.go:334] "Generic (PLEG): container finished" podID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerID="95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11" exitCode=0 Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.871457 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" event={"ID":"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f","Type":"ContainerDied","Data":"95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11"} Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.873828 4860 generic.go:334] "Generic (PLEG): container finished" podID="329ab518-a391-4483-8373-1329318b58da" containerID="ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d" exitCode=0 Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.873886 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" event={"ID":"329ab518-a391-4483-8373-1329318b58da","Type":"ContainerDied","Data":"ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d"} Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.892649 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.897725 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.897807 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.897825 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.897893 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.897912 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:23Z","lastTransitionTime":"2026-03-20T10:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.917890 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.935358 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.952736 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.966661 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.981836 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.001168 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 
10:56:24.001213 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.001277 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.001301 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.001317 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:24Z","lastTransitionTime":"2026-03-20T10:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.006193 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\
\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nbkmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.030776 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.045913 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.059619 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://299876d90de1dc71dcbd805a769b991971944194b57cda3dff90392f35318dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.072475 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ba3ecd4dbf40f800c11196492112694eb2fe160969cbd5f0b6fa7335552777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2176d4712fa4ee31d4322bb4cf9433058b2da0d6000e5bfb9e7d230fdffb3eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.087944 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.105311 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.105354 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.105438 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.105470 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.105485 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:24Z","lastTransitionTime":"2026-03-20T10:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.108431 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.125074 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.139977 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.153765 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.171440 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.194110 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nbkmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.208844 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.208882 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.208891 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.208911 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.208925 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:24Z","lastTransitionTime":"2026-03-20T10:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.212219 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.236437 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.250706 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.265983 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://299876d90de1dc71dcbd805a769b991971944194b57cda3dff90392f35318dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.280547 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ba3ecd4dbf40f800c11196492112694eb2fe160969cbd5f0b6fa7335552777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2176d4712fa4ee31d4322bb4cf9433058b2da0d6000e5bfb9e7d230fdffb3eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.293629 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.308980 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.311205 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.311268 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.311281 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.311298 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.311309 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:24Z","lastTransitionTime":"2026-03-20T10:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.326965 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.412314 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.412330 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:24 crc kubenswrapper[4860]: E0320 10:56:24.412454 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:24 crc kubenswrapper[4860]: E0320 10:56:24.412974 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.413860 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.413888 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.413899 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.413914 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.413926 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:24Z","lastTransitionTime":"2026-03-20T10:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.517475 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.517842 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.517852 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.517868 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.517880 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:24Z","lastTransitionTime":"2026-03-20T10:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.620534 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.620578 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.620591 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.620609 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.620624 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:24Z","lastTransitionTime":"2026-03-20T10:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.730772 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.730824 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.730849 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.730872 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.730887 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:24Z","lastTransitionTime":"2026-03-20T10:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.837471 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.837533 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.837546 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.837568 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.837583 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:24Z","lastTransitionTime":"2026-03-20T10:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.879346 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" event={"ID":"329ab518-a391-4483-8373-1329318b58da","Type":"ContainerStarted","Data":"b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a"} Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.884002 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" event={"ID":"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f","Type":"ContainerStarted","Data":"9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793"} Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.884062 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" event={"ID":"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f","Type":"ContainerStarted","Data":"a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e"} Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.884072 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" event={"ID":"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f","Type":"ContainerStarted","Data":"0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d"} Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.904324 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.920299 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.934523 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://299876d90de1dc71dcbd805a769b991971944194b57cda3dff90392f35318dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.940654 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.940813 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.940930 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.941095 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.941270 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:24Z","lastTransitionTime":"2026-03-20T10:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.949345 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ba3ecd4dbf40f800c11196492112694eb2fe160969cbd5f0b6fa7335552777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2176d4712fa4ee31d4322bb4cf9433058b2da0d6000e5bfb9e7d230fdffb3eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.968541 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.985853 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.001827 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.015363 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:25Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.037629 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:25Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.043662 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.043699 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.043710 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.043727 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.043739 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:25Z","lastTransitionTime":"2026-03-20T10:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.059583 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:25Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.075021 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:25Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.090938 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:25Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.115307 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nbkmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:25Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.146675 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.146720 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.146735 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.146756 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.146770 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:25Z","lastTransitionTime":"2026-03-20T10:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.249181 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.249269 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.249283 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.249306 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.249326 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:25Z","lastTransitionTime":"2026-03-20T10:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.352113 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.352164 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.352175 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.352194 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.352206 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:25Z","lastTransitionTime":"2026-03-20T10:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.413539 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:25 crc kubenswrapper[4860]: E0320 10:56:25.413810 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.454654 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.454701 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.454711 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.454730 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.454746 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:25Z","lastTransitionTime":"2026-03-20T10:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.557608 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.557660 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.557673 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.557692 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.557707 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:25Z","lastTransitionTime":"2026-03-20T10:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.660466 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.660508 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.660517 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.660534 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.660545 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:25Z","lastTransitionTime":"2026-03-20T10:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.763261 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.763350 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.763362 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.763385 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.763401 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:25Z","lastTransitionTime":"2026-03-20T10:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.866903 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.866956 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.866965 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.866981 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.866991 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:25Z","lastTransitionTime":"2026-03-20T10:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.900614 4860 generic.go:334] "Generic (PLEG): container finished" podID="329ab518-a391-4483-8373-1329318b58da" containerID="b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a" exitCode=0 Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.900704 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" event={"ID":"329ab518-a391-4483-8373-1329318b58da","Type":"ContainerDied","Data":"b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a"} Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.906991 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" event={"ID":"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f","Type":"ContainerStarted","Data":"903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf"} Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.907043 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" event={"ID":"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f","Type":"ContainerStarted","Data":"f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f"} Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.907056 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" event={"ID":"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f","Type":"ContainerStarted","Data":"9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18"} Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.919365 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ba3ecd4dbf40f800c11196492112694eb2fe160969cbd5f0b6fa7335552777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2176d4712fa4ee31d4322bb4cf9433058b2da0d6
000e5bfb9e7d230fdffb3eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:25Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.936839 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:25Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.958838 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:25Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.969571 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.969621 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.969637 4860 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.969658 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.969675 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:25Z","lastTransitionTime":"2026-03-20T10:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.982790 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b5
4b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name
\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:25Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.995965 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:25Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.008729 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://299876d90de1dc71dcbd805a769b991971944194b57cda3dff90392f35318dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:26Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.027431 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:26Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.043610 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T10:56:26Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.062385 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nbkmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:26Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.071826 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.071863 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.071872 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.071892 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.071907 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:26Z","lastTransitionTime":"2026-03-20T10:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.077159 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:26Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.091745 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:26Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.107650 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:26Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.122134 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:26Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.175897 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 
10:56:26.175962 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.175974 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.175995 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.176010 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:26Z","lastTransitionTime":"2026-03-20T10:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.279278 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.279336 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.279351 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.279372 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.279387 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:26Z","lastTransitionTime":"2026-03-20T10:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.382074 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.382137 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.382149 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.382172 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.382192 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:26Z","lastTransitionTime":"2026-03-20T10:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.413449 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.413449 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:26 crc kubenswrapper[4860]: E0320 10:56:26.413636 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:26 crc kubenswrapper[4860]: E0320 10:56:26.413669 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.485830 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.485914 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.485932 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.485952 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.485965 4860 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:26Z","lastTransitionTime":"2026-03-20T10:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.514019 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.514078 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.514095 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.514121 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.514139 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:26Z","lastTransitionTime":"2026-03-20T10:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:26 crc kubenswrapper[4860]: E0320 10:56:26.531312 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:26Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.537206 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.537285 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.537298 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.537322 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.537336 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:26Z","lastTransitionTime":"2026-03-20T10:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:26 crc kubenswrapper[4860]: E0320 10:56:26.554042 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:26Z is after 2025-08-24T17:21:41Z"
Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.559164 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.559246 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.559261 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.559285 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.559304 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:26Z","lastTransitionTime":"2026-03-20T10:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:26 crc kubenswrapper[4860]: E0320 10:56:26.574492 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:26Z is after 2025-08-24T17:21:41Z"
Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.579185 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.579256 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.579269 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.579290 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.579311 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:26Z","lastTransitionTime":"2026-03-20T10:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:26 crc kubenswrapper[4860]: E0320 10:56:26.592563 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:26Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.597066 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.597111 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.597124 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.597146 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.597158 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:26Z","lastTransitionTime":"2026-03-20T10:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:26 crc kubenswrapper[4860]: E0320 10:56:26.610883 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:26Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:26 crc kubenswrapper[4860]: E0320 10:56:26.611017 4860 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.613377 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.613421 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.613438 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.613460 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.613472 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:26Z","lastTransitionTime":"2026-03-20T10:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.716181 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.716254 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.716270 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.716288 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.716301 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:26Z","lastTransitionTime":"2026-03-20T10:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.819383 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.819469 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.819492 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.819526 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.819549 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:26Z","lastTransitionTime":"2026-03-20T10:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.914997 4860 generic.go:334] "Generic (PLEG): container finished" podID="329ab518-a391-4483-8373-1329318b58da" containerID="8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0" exitCode=0 Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.915073 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" event={"ID":"329ab518-a391-4483-8373-1329318b58da","Type":"ContainerDied","Data":"8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0"} Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.922394 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.922466 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.922487 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.922516 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.922536 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:26Z","lastTransitionTime":"2026-03-20T10:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.933847 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:26Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.951070 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:26Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.971805 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:26Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.987202 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:26Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.008249 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nbkmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:27Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.023942 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52
676dc0a333f8ba81831f065b4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:27Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.031102 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.031157 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.031276 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.031307 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.031329 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:27Z","lastTransitionTime":"2026-03-20T10:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.047152 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:27Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.061398 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:27Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.072734 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://299876d90de1dc71dcbd805a769b991971944194b57cda3dff90392f35318dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:27Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.084289 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ba3ecd4dbf40f800c11196492112694eb2fe160969cbd5f0b6fa7335552777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2176d4712fa4ee31d4322bb4cf9433058b2da0d6000e5bfb9e7d230fdffb3eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:27Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.097568 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:27Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.110524 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:27Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.123869 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alert
er\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:27Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.133530 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.133685 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.133993 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.134200 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.134429 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:27Z","lastTransitionTime":"2026-03-20T10:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.237497 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.237782 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.237860 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.237926 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.237987 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:27Z","lastTransitionTime":"2026-03-20T10:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.341180 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.341278 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.341294 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.341319 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.341335 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:27Z","lastTransitionTime":"2026-03-20T10:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.412882 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:27 crc kubenswrapper[4860]: E0320 10:56:27.413073 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.433117 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:27Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 
10:56:27.444921 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.444983 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.445003 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.445030 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.445088 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:27Z","lastTransitionTime":"2026-03-20T10:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.457172 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:27Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.477046 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:27Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.490762 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://299876d90de1dc71dcbd805a769b991971944194b57cda3dff90392f35318dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:27Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.506245 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ba3ecd4dbf40f800c11196492112694eb2fe160969cbd5f0b6fa7335552777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2176d4712fa4ee31d4322bb4cf9433058b2da0d6000e5bfb9e7d230fdffb3eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:27Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.524720 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:27Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.540501 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:27Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.547950 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.548019 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.548038 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.548063 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.548083 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:27Z","lastTransitionTime":"2026-03-20T10:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.562880 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:27Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.582960 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:27Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.598850 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:27Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.612730 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:27Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.627229 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:27Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.649789 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nbkmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:27Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.650324 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.650380 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.650394 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.650417 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.650430 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:27Z","lastTransitionTime":"2026-03-20T10:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.759080 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.759206 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.759258 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.759284 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.759345 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:27Z","lastTransitionTime":"2026-03-20T10:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.861957 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.862018 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.862029 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.862051 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.862064 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:27Z","lastTransitionTime":"2026-03-20T10:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.923017 4860 generic.go:334] "Generic (PLEG): container finished" podID="329ab518-a391-4483-8373-1329318b58da" containerID="79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52" exitCode=0 Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.923384 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" event={"ID":"329ab518-a391-4483-8373-1329318b58da","Type":"ContainerDied","Data":"79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52"} Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.947520 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-d
ir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd
6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:27Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.965578 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.965642 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.965658 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 
10:56:27.965673 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.965683 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:27Z","lastTransitionTime":"2026-03-20T10:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.966125 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:27Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.978852 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://299876d90de1dc71dcbd805a769b991971944194b57cda3dff90392f35318dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:27Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.992668 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ba3ecd4dbf40f800c11196492112694eb2fe160969cbd5f0b6fa7335552777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2176d4712fa4ee31d4322bb4cf9433058b2da0d6000e5bfb9e7d230fdffb3eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:27Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.011929 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:28Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.028943 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:28Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.050688 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recur
siveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:28Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.069463 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.069552 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.069561 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.069670 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.069632 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T10:56:28Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.069689 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:28Z","lastTransitionTime":"2026-03-20T10:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.089131 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:28Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.106037 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:28Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.127581 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:28Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.152428 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:28Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.172773 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 
10:56:28.172828 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.172838 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.172857 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.172868 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:28Z","lastTransitionTime":"2026-03-20T10:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.177414 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\
\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nbkmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:28Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.276370 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.276422 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.276435 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.276454 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.276465 4860 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:28Z","lastTransitionTime":"2026-03-20T10:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.356398 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-tggrc"] Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.356907 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-tggrc" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.360292 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.361382 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.361408 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.362295 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.378572 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:28Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.379095 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.379138 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.379148 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.379170 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:28 crc kubenswrapper[4860]: 
I0320 10:56:28.379184 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:28Z","lastTransitionTime":"2026-03-20T10:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.394121 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:28Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.409024 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:28Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.412837 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.412913 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:28 crc kubenswrapper[4860]: E0320 10:56:28.412975 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:28 crc kubenswrapper[4860]: E0320 10:56:28.413073 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.421794 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:28Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.442866 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nbkmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:28Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.458785 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/77d9c3a3-4ed8-43ec-bb4a-fc1d49784105-host\") pod \"node-ca-tggrc\" (UID: \"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105\") " pod="openshift-image-registry/node-ca-tggrc" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.458836 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rv72\" (UniqueName: \"kubernetes.io/projected/77d9c3a3-4ed8-43ec-bb4a-fc1d49784105-kube-api-access-2rv72\") pod \"node-ca-tggrc\" (UID: \"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105\") " pod="openshift-image-registry/node-ca-tggrc" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.458948 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/77d9c3a3-4ed8-43ec-bb4a-fc1d49784105-serviceca\") pod \"node-ca-tggrc\" (UID: \"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105\") " pod="openshift-image-registry/node-ca-tggrc" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.462275 4860 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",
\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\
"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc
20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:28Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.476947 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:28Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.485768 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.485817 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.485829 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:28 crc 
kubenswrapper[4860]: I0320 10:56:28.485850 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.485860 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:28Z","lastTransitionTime":"2026-03-20T10:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.489885 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://299876d90de1dc71dcbd805a769b991971944194b57cda3dff90392f35318dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7
f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:28Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.502283 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ba3ecd4dbf40f800c11196492112694eb2fe160969cbd5f0b6fa7335552777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2176d4712fa4ee31d4322bb4cf9433058b2da0d6
000e5bfb9e7d230fdffb3eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:28Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.514586 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:28Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.535615 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:28Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.555519 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recur
siveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:28Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.559858 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/77d9c3a3-4ed8-43ec-bb4a-fc1d49784105-serviceca\") pod \"node-ca-tggrc\" (UID: \"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105\") " pod="openshift-image-registry/node-ca-tggrc" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.559925 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/77d9c3a3-4ed8-43ec-bb4a-fc1d49784105-host\") pod \"node-ca-tggrc\" (UID: \"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105\") " pod="openshift-image-registry/node-ca-tggrc" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.559953 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rv72\" (UniqueName: \"kubernetes.io/projected/77d9c3a3-4ed8-43ec-bb4a-fc1d49784105-kube-api-access-2rv72\") pod \"node-ca-tggrc\" (UID: \"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105\") " pod="openshift-image-registry/node-ca-tggrc" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.560018 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/77d9c3a3-4ed8-43ec-bb4a-fc1d49784105-host\") pod \"node-ca-tggrc\" (UID: \"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105\") " pod="openshift-image-registry/node-ca-tggrc" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.561839 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/77d9c3a3-4ed8-43ec-bb4a-fc1d49784105-serviceca\") pod \"node-ca-tggrc\" (UID: \"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105\") " pod="openshift-image-registry/node-ca-tggrc" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.579477 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
6-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:28Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.589431 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.589456 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.589465 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.589479 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.589489 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:28Z","lastTransitionTime":"2026-03-20T10:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.590190 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tggrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rv72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tggrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:28Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.594289 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rv72\" (UniqueName: \"kubernetes.io/projected/77d9c3a3-4ed8-43ec-bb4a-fc1d49784105-kube-api-access-2rv72\") pod \"node-ca-tggrc\" (UID: \"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105\") " pod="openshift-image-registry/node-ca-tggrc" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.674750 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-tggrc" Mar 20 10:56:28 crc kubenswrapper[4860]: W0320 10:56:28.688081 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77d9c3a3_4ed8_43ec_bb4a_fc1d49784105.slice/crio-26f6d6a21944429d1371a570a335ec46703ae880f11acff94007fb6734a28654 WatchSource:0}: Error finding container 26f6d6a21944429d1371a570a335ec46703ae880f11acff94007fb6734a28654: Status 404 returned error can't find the container with id 26f6d6a21944429d1371a570a335ec46703ae880f11acff94007fb6734a28654 Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.691175 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.691208 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.691220 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.691240 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.691271 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:28Z","lastTransitionTime":"2026-03-20T10:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.794098 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.794140 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.794152 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.794170 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.794182 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:28Z","lastTransitionTime":"2026-03-20T10:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.898397 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.898451 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.898466 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.898486 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.898501 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:28Z","lastTransitionTime":"2026-03-20T10:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.928162 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-tggrc" event={"ID":"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105","Type":"ContainerStarted","Data":"26f6d6a21944429d1371a570a335ec46703ae880f11acff94007fb6734a28654"} Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.932053 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" event={"ID":"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f","Type":"ContainerStarted","Data":"9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9"} Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.001522 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.001561 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.001572 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.001588 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.001600 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:29Z","lastTransitionTime":"2026-03-20T10:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.104814 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.104856 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.104865 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.104881 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.104894 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:29Z","lastTransitionTime":"2026-03-20T10:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.206988 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.207042 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.207055 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.207088 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.207098 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:29Z","lastTransitionTime":"2026-03-20T10:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.309808 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.309853 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.309865 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.309882 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.309893 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:29Z","lastTransitionTime":"2026-03-20T10:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.329208 4860 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.412396 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:29 crc kubenswrapper[4860]: E0320 10:56:29.412558 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.412964 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.412984 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.412992 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.413009 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.413021 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:29Z","lastTransitionTime":"2026-03-20T10:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.515328 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.515393 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.515411 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.515437 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.515458 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:29Z","lastTransitionTime":"2026-03-20T10:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.540606 4860 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.618978 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.619562 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.619579 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.619599 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.619613 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:29Z","lastTransitionTime":"2026-03-20T10:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.721893 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.721974 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.721996 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.722024 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.722043 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:29Z","lastTransitionTime":"2026-03-20T10:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.825881 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.825933 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.825943 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.825965 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.825979 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:29Z","lastTransitionTime":"2026-03-20T10:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.929227 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.929292 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.929304 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.929322 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.929333 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:29Z","lastTransitionTime":"2026-03-20T10:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.936286 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-tggrc" event={"ID":"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105","Type":"ContainerStarted","Data":"5b2ebec10e00fab3194ba3501ab22f0d9381d9e75d28690a6ee5723da97fd226"} Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.942634 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" event={"ID":"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f","Type":"ContainerStarted","Data":"3c66567a97f03870425c40452e85d7f5d3d9692343d67c511060081eff71f24a"} Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.943092 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.943353 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.948083 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" event={"ID":"329ab518-a391-4483-8373-1329318b58da","Type":"ContainerStarted","Data":"b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef"} Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.955153 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:29Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.972260 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:29Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.994121 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nbkmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:29Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.013263 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:30Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.022182 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.031732 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:30Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.032278 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.032328 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.032340 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 
10:56:30.032363 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.032711 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:30Z","lastTransitionTime":"2026-03-20T10:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.045936 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:30Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.057217 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://299876d90de1dc71dcbd805a769b991971944194b57cda3dff90392f35318dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:30Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.070987 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ba3ecd4dbf40f800c11196492112694eb2fe160969cbd5f0b6fa7335552777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2176d4712fa4ee31d4322bb4cf9433058b2da0d6000e5bfb9e7d230fdffb3eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:30Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.085345 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:30Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.100941 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:30Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.121018 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
6-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-r
esources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:30Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.135097 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.135137 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.135146 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.135162 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.135173 4860 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:30Z","lastTransitionTime":"2026-03-20T10:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.136693 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:30Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.149788 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:30Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.161040 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tggrc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2ebec10e00fab3194ba3501ab22f0d9381d9e75d28690a6ee5723da97fd226\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rv72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tggrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:30Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.177868 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242
fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:30Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.203841 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c687744
1ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"h
ostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9b
e8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:30Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.221683 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:30Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.233660 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://299876d90de1dc71dcbd805a769b991971944194b57cda3dff90392f35318dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:30Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.237757 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.237791 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.237801 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.237817 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.237828 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:30Z","lastTransitionTime":"2026-03-20T10:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.247238 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ba3ecd4dbf40f800c11196492112694eb2fe160969cbd5f0b6fa7335552777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2176d4712fa4ee31d4322bb4cf9433058b2da0d6000e5bfb9e7d230fdffb3eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:30Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.261174 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:30Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.274924 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:30Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.288367 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:30Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.301838 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tggrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2ebec10e00fab3194ba3501ab22f0d9381d9e75d28690a6ee5723da97fd226\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rv72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tggrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:30Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.318631 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09
e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:30Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.334124 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:30Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.340589 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.340629 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.340641 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 
10:56:30.340660 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.340676 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:30Z","lastTransitionTime":"2026-03-20T10:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.351148 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:30Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.368668 4860 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:30Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.388819 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c66567a97f03870425c40452e85d7f5d3d9692343d67c511060081eff71f24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nbkmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:30Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.413465 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.413548 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:30 crc kubenswrapper[4860]: E0320 10:56:30.413675 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:30 crc kubenswrapper[4860]: E0320 10:56:30.413959 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.443928 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.443984 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.443999 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.444020 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.444035 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:30Z","lastTransitionTime":"2026-03-20T10:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.546769 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.546814 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.546826 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.546844 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.546857 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:30Z","lastTransitionTime":"2026-03-20T10:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.652733 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.652800 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.652816 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.652845 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.652858 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:30Z","lastTransitionTime":"2026-03-20T10:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.756059 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.756120 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.756132 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.756151 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.756164 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:30Z","lastTransitionTime":"2026-03-20T10:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.859386 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.859440 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.859453 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.859472 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.859486 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:30Z","lastTransitionTime":"2026-03-20T10:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.951921 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.961937 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.961976 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.961989 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.962008 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.962020 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:30Z","lastTransitionTime":"2026-03-20T10:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.980082 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.994762 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:30Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.005132 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://299876d90de1dc71dcbd805a769b991971944194b57cda3dff90392f35318dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:31Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.017700 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ba3ecd4dbf40f800c11196492112694eb2fe160969cbd5f0b6fa7335552777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2176d4712fa4ee31d4322bb4cf9433058b2da0d6000e5bfb9e7d230fdffb3eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:31Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.030405 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:31Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.047020 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242
fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:31Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.065520 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.065587 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.065601 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.065624 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.065638 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:31Z","lastTransitionTime":"2026-03-20T10:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.069320 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:31Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.083869 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:31Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.095843 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:31Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.106543 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tggrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2ebec10e00fab3194ba3501ab22f0d9381d9e75d28690a6ee5723da97fd226\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"n
ode-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rv72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tggrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:31Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.119973 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:31Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.133071 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:31Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.153634 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c66567a97f03870425c40452e85d7f5d3d9692343d67c511060081eff71f24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nbkmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:31Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.167443 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:31Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.167966 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.168009 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.168021 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.168041 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.168054 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:31Z","lastTransitionTime":"2026-03-20T10:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.179673 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:31Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.270482 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.270561 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.270574 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.270616 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.270628 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:31Z","lastTransitionTime":"2026-03-20T10:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.374517 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.374576 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.374590 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.374618 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.374636 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:31Z","lastTransitionTime":"2026-03-20T10:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.412567 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:31 crc kubenswrapper[4860]: E0320 10:56:31.412895 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.413199 4860 scope.go:117] "RemoveContainer" containerID="b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.478380 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.478432 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.478442 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.478459 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.478472 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:31Z","lastTransitionTime":"2026-03-20T10:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.581096 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.581621 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.581634 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.581652 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.581662 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:31Z","lastTransitionTime":"2026-03-20T10:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.684762 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.684803 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.684814 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.684835 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.684856 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:31Z","lastTransitionTime":"2026-03-20T10:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.787545 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.787588 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.787600 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.787643 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.787658 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:31Z","lastTransitionTime":"2026-03-20T10:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.896906 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.896975 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.897001 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.897027 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.897043 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:31Z","lastTransitionTime":"2026-03-20T10:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.957435 4860 generic.go:334] "Generic (PLEG): container finished" podID="329ab518-a391-4483-8373-1329318b58da" containerID="b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef" exitCode=0 Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.958365 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" event={"ID":"329ab518-a391-4483-8373-1329318b58da","Type":"ContainerDied","Data":"b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef"} Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.975915 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba
93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\"
:\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:31Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.993641 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaea
d203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"o
s-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\
\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:31Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.008043 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.008088 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.008103 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.008122 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.008136 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:32Z","lastTransitionTime":"2026-03-20T10:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.020957 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:32Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.037453 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:32Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.050825 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://299876d90de1dc71dcbd805a769b991971944194b57cda3dff90392f35318dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:32Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.066752 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ba3ecd4dbf40f800c11196492112694eb2fe160969cbd5f0b6fa7335552777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2176d4712fa4ee31d4322bb4cf9433058b2da0d6000e5bfb9e7d230fdffb3eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:32Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.082796 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",
\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:32Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.100345 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptable
s-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:32Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.113031 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.113077 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.113089 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.113108 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.113121 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:32Z","lastTransitionTime":"2026-03-20T10:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.113925 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tggrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2ebec10e00fab3194ba3501ab22f0d9381d9e75d28690a6ee5723da97fd226\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rv72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tggrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:32Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.132081 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:32Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.149487 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:32Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.164719 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:32Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.175782 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:32Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.192832 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c66567a97f03870425c40452e85d7f5d3d9692343d67c511060081eff71f24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nbkmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:32Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.216329 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.216395 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.216407 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.216427 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.216438 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:32Z","lastTransitionTime":"2026-03-20T10:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.319859 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.319894 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.319903 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.319921 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.319932 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:32Z","lastTransitionTime":"2026-03-20T10:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.413382 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.413405 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:32 crc kubenswrapper[4860]: E0320 10:56:32.413562 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:32 crc kubenswrapper[4860]: E0320 10:56:32.413727 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.422611 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.422661 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.422680 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.422706 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.422724 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:32Z","lastTransitionTime":"2026-03-20T10:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.525327 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.525370 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.525383 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.525401 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.525413 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:32Z","lastTransitionTime":"2026-03-20T10:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.627530 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.627570 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.627580 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.627597 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.627607 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:32Z","lastTransitionTime":"2026-03-20T10:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.730288 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.730321 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.730330 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.730346 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.730358 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:32Z","lastTransitionTime":"2026-03-20T10:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.833415 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.833468 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.833478 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.833496 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.833506 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:32Z","lastTransitionTime":"2026-03-20T10:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.937073 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.937131 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.937144 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.937181 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.937217 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:32Z","lastTransitionTime":"2026-03-20T10:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.963291 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.965399 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2c14e94ed75bdc67469f86002f56ed08fc3810f5a6316401c00a66d56d405e14"} Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.965771 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.970136 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" event={"ID":"329ab518-a391-4483-8373-1329318b58da","Type":"ContainerStarted","Data":"0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109"} Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.982427 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:32Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.001531 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:32Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.014020 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tggrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2ebec10e00fab3194ba3501ab22f0d9381d9e75d28690a6ee5723da97fd226\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"n
ode-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rv72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tggrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:33Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.028540 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c14e94ed75bdc67469f86002f56ed08fc3810f5a6316401c00a66d56d405e14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:33Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.040833 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.040889 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.040899 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.040917 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.040929 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:33Z","lastTransitionTime":"2026-03-20T10:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.041651 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:33Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.054060 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:33Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.066102 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:33Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.090546 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c66567a97f03870425c40452e85d7f5d3d9692343d67c511060081eff71f24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nbkmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:33Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.110225 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.12
6.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:33Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.126492 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:33Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.138029 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://299876d90de1dc71dcbd805a769b991971944194b57cda3dff90392f35318dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:33Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.143051 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.143088 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.143097 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.143117 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.143128 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:33Z","lastTransitionTime":"2026-03-20T10:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.149708 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ba3ecd4dbf40f800c11196492112694eb2fe160969cbd5f0b6fa7335552777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2176d4712fa4ee31d4322bb4cf9433058b2da0d6000e5bfb9e7d230fdffb3eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:33Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.161035 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:33Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.174753 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:33Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.187539 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:33Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.200137 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:33Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.211900 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:33Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.230286 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c66567a97f03870425c40452e85d7f5d3d9692343d67c511060081eff71f24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nbkmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:33Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.245735 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c14e94ed75bdc67469f86002f56ed08fc3810f5a6316401c00a66d56d405e14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:33Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.245887 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.245919 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.245928 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.245943 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.245955 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:33Z","lastTransitionTime":"2026-03-20T10:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.264122 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:33Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.274629 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:33Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.284341 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://299876d90de1dc71dcbd805a769b991971944194b57cda3dff90392f35318dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:33Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.295137 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ba3ecd4dbf40f800c11196492112694eb2fe160969cbd5f0b6fa7335552777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2176d4712fa4ee31d4322bb4cf9433058b2da0d6000e5bfb9e7d230fdffb3eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:33Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.309321 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:33Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.327366 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:33Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.344848 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:33Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.348723 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.348768 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.348781 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.348800 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.348814 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:33Z","lastTransitionTime":"2026-03-20T10:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.358654 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:33Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.372869 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tggrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2ebec10e00fab3194ba3501ab22f0d9381d9e75d28690a6ee5723da97fd226\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rv72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tggrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:33Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.412654 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:33 crc kubenswrapper[4860]: E0320 10:56:33.412807 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.451148 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.451218 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.451258 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.451280 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.451293 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:33Z","lastTransitionTime":"2026-03-20T10:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.554623 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.554726 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.554740 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.555309 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.555373 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:33Z","lastTransitionTime":"2026-03-20T10:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.659126 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.659188 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.659199 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.659226 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.659253 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:33Z","lastTransitionTime":"2026-03-20T10:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.761782 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.761820 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.761830 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.761847 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.761858 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:33Z","lastTransitionTime":"2026-03-20T10:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.866116 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.866164 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.866176 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.866203 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.866214 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:33Z","lastTransitionTime":"2026-03-20T10:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.970096 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.970139 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.970149 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.970172 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.970184 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:33Z","lastTransitionTime":"2026-03-20T10:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.977085 4860 generic.go:334] "Generic (PLEG): container finished" podID="329ab518-a391-4483-8373-1329318b58da" containerID="0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109" exitCode=0 Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.977145 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" event={"ID":"329ab518-a391-4483-8373-1329318b58da","Type":"ContainerDied","Data":"0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109"} Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.016729 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.052642 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.069084 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.073480 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.073525 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.073536 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:34 crc 
kubenswrapper[4860]: I0320 10:56:34.073559 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.073572 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:34Z","lastTransitionTime":"2026-03-20T10:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.082504 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://299876d90de1dc71dcbd805a769b991971944194b57cda3dff90392f35318dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7
f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.098385 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ba3ecd4dbf40f800c11196492112694eb2fe160969cbd5f0b6fa7335552777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2176d4712fa4ee31d4322bb4cf9433058b2da0d6
000e5bfb9e7d230fdffb3eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.114817 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.127586 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.139969 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.152064 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tggrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2ebec10e00fab3194ba3501ab22f0d9381d9e75d28690a6ee5723da97fd226\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rv72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tggrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.169059 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c14e94ed75bdc67469f86002f56ed08fc3810f5a6316401c00a66d56d405e14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.175829 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.175874 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.175887 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.175910 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.175923 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:34Z","lastTransitionTime":"2026-03-20T10:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.185139 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.198760 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.212334 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.232067 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c66567a97f03870425c40452e85d7f5d3d9692343d67c511060081eff71f24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nbkmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.278715 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.278756 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.278765 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.278781 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.278792 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:34Z","lastTransitionTime":"2026-03-20T10:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.299807 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s"] Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.300310 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.302148 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.302729 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.313348 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a19086-4679-4d42-96b8-942e00d8491f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ngb2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.319556 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/69a19086-4679-4d42-96b8-942e00d8491f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-ngb2s\" (UID: \"69a19086-4679-4d42-96b8-942e00d8491f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.319588 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/69a19086-4679-4d42-96b8-942e00d8491f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-ngb2s\" (UID: \"69a19086-4679-4d42-96b8-942e00d8491f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.319672 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/69a19086-4679-4d42-96b8-942e00d8491f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-ngb2s\" (UID: \"69a19086-4679-4d42-96b8-942e00d8491f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.319834 4860 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrm4v\" (UniqueName: \"kubernetes.io/projected/69a19086-4679-4d42-96b8-942e00d8491f-kube-api-access-rrm4v\") pod \"ovnkube-control-plane-749d76644c-ngb2s\" (UID: \"69a19086-4679-4d42-96b8-942e00d8491f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.325396 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.335217 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://299876d90de1dc71dcbd805a769b991971944194b57cda3dff90392f35318dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.346814 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ba3ecd4dbf40f800c11196492112694eb2fe160969cbd5f0b6fa7335552777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2176d4712fa4ee31d4322bb4cf9433058b2da0d6000e5bfb9e7d230fdffb3eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.360975 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.376028 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.381400 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.381425 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.381436 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.381451 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.381462 4860 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:34Z","lastTransitionTime":"2026-03-20T10:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.398391 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-
dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920
853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a6
7314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.412560 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.412684 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:34 crc kubenswrapper[4860]: E0320 10:56:34.412696 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:34 crc kubenswrapper[4860]: E0320 10:56:34.412867 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.415428 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.420166 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/69a19086-4679-4d42-96b8-942e00d8491f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-ngb2s\" (UID: \"69a19086-4679-4d42-96b8-942e00d8491f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.420227 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrm4v\" (UniqueName: \"kubernetes.io/projected/69a19086-4679-4d42-96b8-942e00d8491f-kube-api-access-rrm4v\") pod \"ovnkube-control-plane-749d76644c-ngb2s\" (UID: \"69a19086-4679-4d42-96b8-942e00d8491f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.420274 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" 
(UniqueName: \"kubernetes.io/configmap/69a19086-4679-4d42-96b8-942e00d8491f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-ngb2s\" (UID: \"69a19086-4679-4d42-96b8-942e00d8491f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.420299 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/69a19086-4679-4d42-96b8-942e00d8491f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-ngb2s\" (UID: \"69a19086-4679-4d42-96b8-942e00d8491f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.421158 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/69a19086-4679-4d42-96b8-942e00d8491f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-ngb2s\" (UID: \"69a19086-4679-4d42-96b8-942e00d8491f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.421340 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/69a19086-4679-4d42-96b8-942e00d8491f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-ngb2s\" (UID: \"69a19086-4679-4d42-96b8-942e00d8491f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.427919 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/69a19086-4679-4d42-96b8-942e00d8491f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-ngb2s\" (UID: \"69a19086-4679-4d42-96b8-942e00d8491f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s" Mar 20 
10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.428761 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.439097 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrm4v\" (UniqueName: \"kubernetes.io/projected/69a19086-4679-4d42-96b8-942e00d8491f-kube-api-access-rrm4v\") pod \"ovnkube-control-plane-749d76644c-ngb2s\" (UID: \"69a19086-4679-4d42-96b8-942e00d8491f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.441975 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tggrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2ebec10e00fab3194ba3501ab22f0d9381d9e75d28690a6ee5723da97fd226\\\",\\\"image\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rv72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tggrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.458656 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.475330 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.484004 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 
10:56:34.484049 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.484060 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.484081 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.484094 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:34Z","lastTransitionTime":"2026-03-20T10:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.496578 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c66567a97f03870425c40452e85d7f5d3d9692343d67c511060081eff71f24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mou
ntPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9\\\",\
\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nbkmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.514131 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c14e94ed75bdc67469f86002f56ed08fc3810f5a6316401c00a66d56d405e14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.528028 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.586951 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.587003 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.587014 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 
10:56:34.587031 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.587044 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:34Z","lastTransitionTime":"2026-03-20T10:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.615377 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s" Mar 20 10:56:34 crc kubenswrapper[4860]: W0320 10:56:34.631305 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69a19086_4679_4d42_96b8_942e00d8491f.slice/crio-689f3898c85bbd1330a122c8329ad41871a10f1259e1aa08a905153b898167a2 WatchSource:0}: Error finding container 689f3898c85bbd1330a122c8329ad41871a10f1259e1aa08a905153b898167a2: Status 404 returned error can't find the container with id 689f3898c85bbd1330a122c8329ad41871a10f1259e1aa08a905153b898167a2 Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.690353 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.690390 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.690401 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.690419 4860 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeNotReady" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.690431 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:34Z","lastTransitionTime":"2026-03-20T10:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.793027 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.793075 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.793087 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.793110 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.793123 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:34Z","lastTransitionTime":"2026-03-20T10:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.898492 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.898560 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.898587 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.898624 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.898641 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:34Z","lastTransitionTime":"2026-03-20T10:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.983348 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s" event={"ID":"69a19086-4679-4d42-96b8-942e00d8491f","Type":"ContainerStarted","Data":"689f3898c85bbd1330a122c8329ad41871a10f1259e1aa08a905153b898167a2"} Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.001850 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.001904 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.001915 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.001935 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.001951 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:35Z","lastTransitionTime":"2026-03-20T10:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.041578 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-q85gq"] Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.042325 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:56:35 crc kubenswrapper[4860]: E0320 10:56:35.042484 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.060904 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\
"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:35Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.076080 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T10:56:35Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.088352 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tggrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2ebec10e00fab3194ba3501ab22f0d9381d9e75d28690a6ee5723da97fd226\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rv72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tggrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:35Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.102465 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q85gq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"035f0b3d-92ee-4564-8dad-28b231e1c800\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q85gq\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:35Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.104075 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.104142 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.104153 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.104173 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.104185 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:35Z","lastTransitionTime":"2026-03-20T10:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.122783 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:35Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.128435 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/035f0b3d-92ee-4564-8dad-28b231e1c800-metrics-certs\") pod \"network-metrics-daemon-q85gq\" (UID: \"035f0b3d-92ee-4564-8dad-28b231e1c800\") " pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.128515 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlrn2\" (UniqueName: 
\"kubernetes.io/projected/035f0b3d-92ee-4564-8dad-28b231e1c800-kube-api-access-dlrn2\") pod \"network-metrics-daemon-q85gq\" (UID: \"035f0b3d-92ee-4564-8dad-28b231e1c800\") " pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.141394 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:35Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.164341 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c66567a97f03870425c40452e85d7f5d3d9692343d67c511060081eff71f24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nbkmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:35Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.177897 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c14e94ed75bdc67469f86002f56ed08fc3810f5a6316401c00a66d56d405e14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:35Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.192378 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:35Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.206477 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a19086-4679-4d42-96b8-942e00d8491f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ngb2s\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:35Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.207014 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.207044 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.207053 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.207070 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.207080 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:35Z","lastTransitionTime":"2026-03-20T10:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.222323 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:35Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.229860 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/035f0b3d-92ee-4564-8dad-28b231e1c800-metrics-certs\") pod \"network-metrics-daemon-q85gq\" (UID: \"035f0b3d-92ee-4564-8dad-28b231e1c800\") " pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.229957 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlrn2\" (UniqueName: \"kubernetes.io/projected/035f0b3d-92ee-4564-8dad-28b231e1c800-kube-api-access-dlrn2\") pod \"network-metrics-daemon-q85gq\" (UID: \"035f0b3d-92ee-4564-8dad-28b231e1c800\") " pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:56:35 crc kubenswrapper[4860]: E0320 10:56:35.230135 4860 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 10:56:35 crc kubenswrapper[4860]: E0320 
10:56:35.230318 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/035f0b3d-92ee-4564-8dad-28b231e1c800-metrics-certs podName:035f0b3d-92ee-4564-8dad-28b231e1c800 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:35.730287059 +0000 UTC m=+119.951648027 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/035f0b3d-92ee-4564-8dad-28b231e1c800-metrics-certs") pod "network-metrics-daemon-q85gq" (UID: "035f0b3d-92ee-4564-8dad-28b231e1c800") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.237542 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://299876d90de1dc71dcbd805a769b991971944194b57cda3dff90392f35318dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b1
9888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:35Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.248784 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlrn2\" (UniqueName: \"kubernetes.io/projected/035f0b3d-92ee-4564-8dad-28b231e1c800-kube-api-access-dlrn2\") pod \"network-metrics-daemon-q85gq\" (UID: \"035f0b3d-92ee-4564-8dad-28b231e1c800\") " pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.251516 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ba3ecd4dbf40f800c11196492112694eb2fe160969cbd5f0b6fa7335552777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2176d4712fa4ee31d4322bb4cf9433058b2da0d6
000e5bfb9e7d230fdffb3eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:35Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.267698 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:35Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.286057 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:35Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.310693 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.310760 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.310780 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.310809 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.310834 4860 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:35Z","lastTransitionTime":"2026-03-20T10:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.314285 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-
dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920
853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a6
7314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:35Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.412453 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:35 crc kubenswrapper[4860]: E0320 10:56:35.412593 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.414095 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.414139 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.414150 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.414172 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.414185 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:35Z","lastTransitionTime":"2026-03-20T10:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.517481 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.517537 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.517550 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.517571 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.517584 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:35Z","lastTransitionTime":"2026-03-20T10:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.620963 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.621049 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.621071 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.621107 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.621133 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:35Z","lastTransitionTime":"2026-03-20T10:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.725127 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.725166 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.725175 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.725191 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.725201 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:35Z","lastTransitionTime":"2026-03-20T10:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.735069 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/035f0b3d-92ee-4564-8dad-28b231e1c800-metrics-certs\") pod \"network-metrics-daemon-q85gq\" (UID: \"035f0b3d-92ee-4564-8dad-28b231e1c800\") " pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:56:35 crc kubenswrapper[4860]: E0320 10:56:35.735221 4860 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 10:56:35 crc kubenswrapper[4860]: E0320 10:56:35.735671 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/035f0b3d-92ee-4564-8dad-28b231e1c800-metrics-certs podName:035f0b3d-92ee-4564-8dad-28b231e1c800 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:36.735651847 +0000 UTC m=+120.957012755 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/035f0b3d-92ee-4564-8dad-28b231e1c800-metrics-certs") pod "network-metrics-daemon-q85gq" (UID: "035f0b3d-92ee-4564-8dad-28b231e1c800") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.828195 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.828259 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.828274 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.828297 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.828313 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:35Z","lastTransitionTime":"2026-03-20T10:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.932487 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.932565 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.932584 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.932608 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.932632 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:35Z","lastTransitionTime":"2026-03-20T10:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.993125 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s" event={"ID":"69a19086-4679-4d42-96b8-942e00d8491f","Type":"ContainerStarted","Data":"7a83599b2c542987a93965e3fd026abc1eccd07fb78dc6ad777b03821eb4ed59"} Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.993288 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s" event={"ID":"69a19086-4679-4d42-96b8-942e00d8491f","Type":"ContainerStarted","Data":"029dd44d42d236df04007ba436ae1800ebdf356c32c3506f621c55418b50526e"} Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.996896 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" event={"ID":"329ab518-a391-4483-8373-1329318b58da","Type":"ContainerStarted","Data":"8c4df2ec9fe14b2447ca7aa8d5d033c15f8ac4a7ce1cd4cb8430596293d9b9d4"} Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.010806 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c14e94ed75bdc67469f86002f56ed08fc3810f5a6316401c00a66d56d405e14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.027028 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.038568 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.038614 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.038630 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 
10:56:36.038654 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.038667 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:36Z","lastTransitionTime":"2026-03-20T10:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.045089 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.057967 4860 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.078457 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c66567a97f03870425c40452e85d7f5d3d9692343d67c511060081eff71f24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nbkmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.091504 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a19086-4679-4d42-96b8-942e00d8491f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://029dd44d42d236df04007ba436ae1800ebdf356c32c3506f621c55418b50526e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a83599b2c542987a93965e3fd026abc1eccd07fb78dc6ad777b03821eb4ed59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ngb2s\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.110950 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.140538 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:56:36 crc kubenswrapper[4860]: E0320 10:56:36.140737 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 10:57:08.140699504 +0000 UTC m=+152.362060402 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.141319 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a
67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"starte
dAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIP
s\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"i
mageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.142376 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.142414 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.142428 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.142452 4860 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeNotReady" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.142469 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:36Z","lastTransitionTime":"2026-03-20T10:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.160575 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when 
the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.174201 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://299876d90de1dc71dcbd805a769b991971944194b57cda3dff90392f35318dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.187029 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ba3ecd4dbf40f800c11196492112694eb2fe160969cbd5f0b6fa7335552777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2176d4712fa4ee31d4322bb4cf9433058b2da0d6000e5bfb9e7d230fdffb3eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.201370 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.217048 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.233521 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alert
er\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.241447 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.241500 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.241526 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.241542 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:36 crc kubenswrapper[4860]: E0320 10:56:36.241612 4860 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 10:56:36 crc kubenswrapper[4860]: E0320 10:56:36.241627 4860 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 10:56:36 crc kubenswrapper[4860]: E0320 10:56:36.241636 4860 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 10:56:36 crc kubenswrapper[4860]: E0320 10:56:36.241649 4860 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:56:36 crc kubenswrapper[4860]: E0320 10:56:36.241675 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-20 10:57:08.241659193 +0000 UTC m=+152.463020091 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 10:56:36 crc kubenswrapper[4860]: E0320 10:56:36.241687 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 10:57:08.241682313 +0000 UTC m=+152.463043211 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:56:36 crc kubenswrapper[4860]: E0320 10:56:36.241702 4860 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:56:36 crc kubenswrapper[4860]: E0320 10:56:36.241727 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 10:57:08.241718324 +0000 UTC m=+152.463079222 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:56:36 crc kubenswrapper[4860]: E0320 10:56:36.241832 4860 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 10:56:36 crc kubenswrapper[4860]: E0320 10:56:36.241880 4860 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 10:56:36 crc kubenswrapper[4860]: E0320 10:56:36.241896 4860 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:56:36 crc kubenswrapper[4860]: E0320 10:56:36.241988 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 10:57:08.241962391 +0000 UTC m=+152.463323459 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.244953 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.244984 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.245003 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.245021 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.245034 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:36Z","lastTransitionTime":"2026-03-20T10:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.249177 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tggrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2ebec10e00fab3194ba3501ab22f0d9381d9e75d28690a6ee5723da97fd226\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rv72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tggrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.261268 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q85gq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"035f0b3d-92ee-4564-8dad-28b231e1c800\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q85gq\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.280675 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c4df2ec9fe14b2447ca7aa8d5d033c15f8ac4a7ce1cd4cb8430596293d9b9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\
":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"sys
tem-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.305783 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f4
2928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\
\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c0
1007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.323014 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.337982 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://299876d90de1dc71dcbd805a769b991971944194b57cda3dff90392f35318dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.347901 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.347949 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.347963 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.347985 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.347999 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:36Z","lastTransitionTime":"2026-03-20T10:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.352808 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ba3ecd4dbf40f800c11196492112694eb2fe160969cbd5f0b6fa7335552777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2176d4712fa4ee31d4322bb4cf9433058b2da0d6000e5bfb9e7d230fdffb3eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.367700 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.383718 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.398782 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.410450 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tggrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2ebec10e00fab3194ba3501ab22f0d9381d9e75d28690a6ee5723da97fd226\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rv72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tggrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.412643 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:56:36 crc kubenswrapper[4860]: E0320 10:56:36.412775 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.413109 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:36 crc kubenswrapper[4860]: E0320 10:56:36.413190 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.413310 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:36 crc kubenswrapper[4860]: E0320 10:56:36.413398 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.427853 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q85gq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"035f0b3d-92ee-4564-8dad-28b231e1c800\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q85gq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:36 crc 
kubenswrapper[4860]: I0320 10:56:36.441314 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c14e94ed75bdc67469f86002f56ed08fc3810f5a6316401c00a66d56d405e14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.451539 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.451615 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.451635 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.451661 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.451679 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:36Z","lastTransitionTime":"2026-03-20T10:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.459038 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.474996 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.486823 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.511962 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c66567a97f03870425c40452e85d7f5d3d9692343d67c511060081eff71f24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nbkmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.526495 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a19086-4679-4d42-96b8-942e00d8491f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://029dd44d42d236df04007ba436ae1800ebdf356c32c3506f621c55418b50526e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a83599b2c542987a93965e3fd026abc1eccd07fb78dc6ad777b03821eb4ed59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ngb2s\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.554016 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.554055 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.554068 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.554089 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.554102 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:36Z","lastTransitionTime":"2026-03-20T10:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.647892 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.647959 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.647979 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.648023 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.648046 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:36Z","lastTransitionTime":"2026-03-20T10:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:36 crc kubenswrapper[4860]: E0320 10:56:36.666661 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.671757 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.671815 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.671828 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.671849 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.671864 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:36Z","lastTransitionTime":"2026-03-20T10:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:36 crc kubenswrapper[4860]: E0320 10:56:36.688542 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.694404 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.694477 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.694498 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.694529 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.694550 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:36Z","lastTransitionTime":"2026-03-20T10:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:36 crc kubenswrapper[4860]: E0320 10:56:36.713012 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.718761 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.718826 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.718846 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.718870 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.718902 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:36Z","lastTransitionTime":"2026-03-20T10:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:36 crc kubenswrapper[4860]: E0320 10:56:36.737409 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.743166 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.743235 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.743292 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.743328 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.743355 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:36Z","lastTransitionTime":"2026-03-20T10:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.747824 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/035f0b3d-92ee-4564-8dad-28b231e1c800-metrics-certs\") pod \"network-metrics-daemon-q85gq\" (UID: \"035f0b3d-92ee-4564-8dad-28b231e1c800\") " pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:56:36 crc kubenswrapper[4860]: E0320 10:56:36.747984 4860 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 10:56:36 crc kubenswrapper[4860]: E0320 10:56:36.748033 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/035f0b3d-92ee-4564-8dad-28b231e1c800-metrics-certs podName:035f0b3d-92ee-4564-8dad-28b231e1c800 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:38.748017017 +0000 UTC m=+122.969377915 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/035f0b3d-92ee-4564-8dad-28b231e1c800-metrics-certs") pod "network-metrics-daemon-q85gq" (UID: "035f0b3d-92ee-4564-8dad-28b231e1c800") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 10:56:36 crc kubenswrapper[4860]: E0320 10:56:36.760381 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb4
9c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\"
:[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d4
6c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\
\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e
8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:36 crc kubenswrapper[4860]: E0320 10:56:36.760503 4860 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.762212 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.762263 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.762274 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.762290 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.762302 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:36Z","lastTransitionTime":"2026-03-20T10:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.865942 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.865999 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.866010 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.866030 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.866042 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:36Z","lastTransitionTime":"2026-03-20T10:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.969205 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.969283 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.969296 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.969316 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.969329 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:36Z","lastTransitionTime":"2026-03-20T10:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.004380 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nbkmw_eb85f6f9-1c0f-4388-9464-25dfe48d8d0f/ovnkube-controller/0.log" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.011385 4860 generic.go:334] "Generic (PLEG): container finished" podID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerID="3c66567a97f03870425c40452e85d7f5d3d9692343d67c511060081eff71f24a" exitCode=1 Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.011429 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" event={"ID":"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f","Type":"ContainerDied","Data":"3c66567a97f03870425c40452e85d7f5d3d9692343d67c511060081eff71f24a"} Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.012179 4860 scope.go:117] "RemoveContainer" containerID="3c66567a97f03870425c40452e85d7f5d3d9692343d67c511060081eff71f24a" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.027688 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.044985 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.071241 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c66567a97f03870425c40452e85d7f5d3d9692343d67c511060081eff71f24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c66567a97f03870425c40452e85d7f5d3d9692343d67c511060081eff71f24a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"message\\\":\\\"mers/externalversions/factory.go:140\\\\nI0320 10:56:35.889696 6663 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 10:56:35.890004 6663 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 10:56:35.890027 6663 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 10:56:35.890077 6663 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 10:56:35.890136 6663 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 10:56:35.890097 6663 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 10:56:35.890208 6663 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 10:56:35.890288 6663 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 10:56:35.890314 6663 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 10:56:35.890331 6663 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 10:56:35.890321 6663 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 10:56:35.890392 6663 factory.go:656] Stopping watch factory\\\\nI0320 10:56:35.890425 6663 ovnkube.go:599] Stopped ovnkube\\\\nI0320 10:56:35.890405 6663 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 
10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nbkmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.072037 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.072089 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.072099 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.072119 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.072129 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:37Z","lastTransitionTime":"2026-03-20T10:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.086337 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c14e94ed75bdc67469f86002f56ed08fc3810f5a6316401c00a66d56d405e14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.101392 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.116429 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a19086-4679-4d42-96b8-942e00d8491f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://029dd44d42d236df04007ba436ae1800ebdf356c32c3506f621c55418b50526e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a83599b2c542987a93965e3fd026abc1eccd
07fb78dc6ad777b03821eb4ed59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ngb2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.130168 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.141368 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://299876d90de1dc71dcbd805a769b991971944194b57cda3dff90392f35318dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.156408 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ba3ecd4dbf40f800c11196492112694eb2fe160969cbd5f0b6fa7335552777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2176d4712fa4ee31d4322bb4cf9433058b2da0d6000e5bfb9e7d230fdffb3eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.170741 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.175407 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.175526 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.175596 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.175619 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.175649 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:37Z","lastTransitionTime":"2026-03-20T10:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.188741 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c4df2ec9fe14b2447ca7aa8d5d033c15f8ac4a7ce1cd4cb8430596293d9b9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.210966 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.1
26.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift
-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.225955 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.240967 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.254345 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tggrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2ebec10e00fab3194ba3501ab22f0d9381d9e75d28690a6ee5723da97fd226\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"n
ode-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rv72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tggrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.268907 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q85gq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"035f0b3d-92ee-4564-8dad-28b231e1c800\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q85gq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:37 crc 
kubenswrapper[4860]: E0320 10:56:37.276960 4860 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.412680 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:37 crc kubenswrapper[4860]: E0320 10:56:37.412837 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.428098 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c14e94ed75bdc67469f86002f56ed08fc3810f5a6316401c00a66d56d405e14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.443066 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.456581 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.472816 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.498654 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c66567a97f03870425c40452e85d7f5d3d9692343d67c511060081eff71f24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c66567a97f03870425c40452e85d7f5d3d9692343d67c511060081eff71f24a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"message\\\":\\\"mers/externalversions/factory.go:140\\\\nI0320 10:56:35.889696 6663 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 10:56:35.890004 6663 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 10:56:35.890027 6663 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 10:56:35.890077 6663 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 10:56:35.890136 6663 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 10:56:35.890097 6663 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 10:56:35.890208 6663 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 10:56:35.890288 6663 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 10:56:35.890314 6663 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 10:56:35.890331 6663 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 10:56:35.890321 6663 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 10:56:35.890392 6663 factory.go:656] Stopping watch factory\\\\nI0320 10:56:35.890425 6663 ovnkube.go:599] Stopped ovnkube\\\\nI0320 10:56:35.890405 6663 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 
10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nbkmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.513627 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a19086-4679-4d42-96b8-942e00d8491f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://029dd44d42d236df04007ba436ae1800ebdf356c32c3506f621c55418b50526e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a83599b2c542987a93965e3fd026abc1eccd
07fb78dc6ad777b03821eb4ed59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ngb2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.532135 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.554266 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c4df2ec9fe14b2447ca7aa8d5d033c15f8ac4a7ce1cd4cb8430596293d9b9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bd
a918ce7c5fade6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:
56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:31Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.579449 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://
b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c68774
41ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:37 crc kubenswrapper[4860]: E0320 10:56:37.586663 4860 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.601998 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.618048 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://299876d90de1dc71dcbd805a769b991971944194b57cda3dff90392f35318dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.633326 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ba3ecd4dbf40f800c11196492112694eb2fe160969cbd5f0b6fa7335552777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2176d4712fa4ee31d4322bb4cf9433058b2da0d6000e5bfb9e7d230fdffb3eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.651166 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q85gq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"035f0b3d-92ee-4564-8dad-28b231e1c800\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q85gq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:37 crc 
kubenswrapper[4860]: I0320 10:56:37.666831 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.682028 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.695817 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tggrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2ebec10e00fab3194ba3501ab22f0d9381d9e75d28690a6ee5723da97fd226\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086
a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rv72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tggrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:38 crc kubenswrapper[4860]: I0320 10:56:38.018100 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nbkmw_eb85f6f9-1c0f-4388-9464-25dfe48d8d0f/ovnkube-controller/0.log" Mar 20 10:56:38 crc kubenswrapper[4860]: I0320 10:56:38.023558 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" event={"ID":"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f","Type":"ContainerStarted","Data":"355de5a5f850fa7e12c9ee5b4b152314e331aca100b142486cde31a6ca5b9f77"} Mar 20 10:56:38 crc kubenswrapper[4860]: I0320 10:56:38.024156 4860 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:38 crc kubenswrapper[4860]: I0320 10:56:38.042896 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\
\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:38Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:38 crc kubenswrapper[4860]: I0320 10:56:38.055124 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-s
cript\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:38Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:38 crc kubenswrapper[4860]: I0320 10:56:38.070920 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tggrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2ebec10e00fab3194ba3501ab22f0d9381d9e75d28690a6ee5723da97fd226\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rv72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tggrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:38Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:38 crc kubenswrapper[4860]: I0320 10:56:38.084913 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q85gq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"035f0b3d-92ee-4564-8dad-28b231e1c800\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q85gq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:38Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:38 crc 
kubenswrapper[4860]: I0320 10:56:38.100656 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:38Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:38 crc kubenswrapper[4860]: I0320 10:56:38.115602 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:38Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:38 crc kubenswrapper[4860]: I0320 10:56:38.139076 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://355de5a5f850fa7e12c9ee5b4b152314e331aca100b142486cde31a6ca5b9f77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c66567a97f03870425c40452e85d7f5d3d9692343d67c511060081eff71f24a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"message\\\":\\\"mers/externalversions/factory.go:140\\\\nI0320 10:56:35.889696 6663 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 10:56:35.890004 6663 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 
10:56:35.890027 6663 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 10:56:35.890077 6663 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 10:56:35.890136 6663 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 10:56:35.890097 6663 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 10:56:35.890208 6663 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 10:56:35.890288 6663 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 10:56:35.890314 6663 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 10:56:35.890331 6663 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 10:56:35.890321 6663 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 10:56:35.890392 6663 factory.go:656] Stopping watch factory\\\\nI0320 10:56:35.890425 6663 ovnkube.go:599] Stopped ovnkube\\\\nI0320 10:56:35.890405 6663 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 
10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nbkmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:38Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:38 crc kubenswrapper[4860]: I0320 10:56:38.154780 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c14e94ed75bdc67469f86002f56ed08fc3810f5a6316401c00a66d56d405e14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:38Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:38 crc kubenswrapper[4860]: I0320 10:56:38.168904 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:38Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:38 crc kubenswrapper[4860]: I0320 10:56:38.182567 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a19086-4679-4d42-96b8-942e00d8491f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://029dd44d42d236df04007ba436ae1800ebdf356c32c3506f621c55418b50526e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a83599b2c542987a93965e3fd026abc1eccd
07fb78dc6ad777b03821eb4ed59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ngb2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:38Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:38 crc kubenswrapper[4860]: I0320 10:56:38.198631 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:38Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:38 crc kubenswrapper[4860]: I0320 10:56:38.211142 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://299876d90de1dc71dcbd805a769b991971944194b57cda3dff90392f35318dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:38Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:38 crc kubenswrapper[4860]: I0320 10:56:38.225285 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ba3ecd4dbf40f800c11196492112694eb2fe160969cbd5f0b6fa7335552777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2176d4712fa4ee31d4322bb4cf9433058b2da0d6000e5bfb9e7d230fdffb3eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:38Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:38 crc kubenswrapper[4860]: I0320 10:56:38.239600 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:38Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:38 crc kubenswrapper[4860]: I0320 10:56:38.258913 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c4df2ec9fe14b2447ca7aa8d5d033c15f8ac4a7ce1cd4cb8430596293d9b9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:38Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:38 crc kubenswrapper[4860]: I0320 10:56:38.282056 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-
dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ec
d6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:38Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:38 crc kubenswrapper[4860]: I0320 10:56:38.412964 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:38 crc kubenswrapper[4860]: I0320 10:56:38.413003 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:38 crc kubenswrapper[4860]: I0320 10:56:38.412919 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:56:38 crc kubenswrapper[4860]: E0320 10:56:38.413157 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:38 crc kubenswrapper[4860]: E0320 10:56:38.413403 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:38 crc kubenswrapper[4860]: E0320 10:56:38.413676 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:56:38 crc kubenswrapper[4860]: I0320 10:56:38.768151 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/035f0b3d-92ee-4564-8dad-28b231e1c800-metrics-certs\") pod \"network-metrics-daemon-q85gq\" (UID: \"035f0b3d-92ee-4564-8dad-28b231e1c800\") " pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:56:38 crc kubenswrapper[4860]: E0320 10:56:38.768456 4860 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 10:56:38 crc kubenswrapper[4860]: E0320 10:56:38.768612 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/035f0b3d-92ee-4564-8dad-28b231e1c800-metrics-certs podName:035f0b3d-92ee-4564-8dad-28b231e1c800 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:42.768581634 +0000 UTC m=+126.989942562 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/035f0b3d-92ee-4564-8dad-28b231e1c800-metrics-certs") pod "network-metrics-daemon-q85gq" (UID: "035f0b3d-92ee-4564-8dad-28b231e1c800") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 10:56:39 crc kubenswrapper[4860]: I0320 10:56:39.029216 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nbkmw_eb85f6f9-1c0f-4388-9464-25dfe48d8d0f/ovnkube-controller/1.log" Mar 20 10:56:39 crc kubenswrapper[4860]: I0320 10:56:39.030436 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nbkmw_eb85f6f9-1c0f-4388-9464-25dfe48d8d0f/ovnkube-controller/0.log" Mar 20 10:56:39 crc kubenswrapper[4860]: I0320 10:56:39.033799 4860 generic.go:334] "Generic (PLEG): container finished" podID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerID="355de5a5f850fa7e12c9ee5b4b152314e331aca100b142486cde31a6ca5b9f77" exitCode=1 Mar 20 10:56:39 crc kubenswrapper[4860]: I0320 10:56:39.033846 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" event={"ID":"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f","Type":"ContainerDied","Data":"355de5a5f850fa7e12c9ee5b4b152314e331aca100b142486cde31a6ca5b9f77"} Mar 20 10:56:39 crc kubenswrapper[4860]: I0320 10:56:39.033890 4860 scope.go:117] "RemoveContainer" containerID="3c66567a97f03870425c40452e85d7f5d3d9692343d67c511060081eff71f24a" Mar 20 10:56:39 crc kubenswrapper[4860]: I0320 10:56:39.035152 4860 scope.go:117] "RemoveContainer" containerID="355de5a5f850fa7e12c9ee5b4b152314e331aca100b142486cde31a6ca5b9f77" Mar 20 10:56:39 crc kubenswrapper[4860]: E0320 10:56:39.035689 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nbkmw_openshift-ovn-kubernetes(eb85f6f9-1c0f-4388-9464-25dfe48d8d0f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" Mar 20 10:56:39 crc kubenswrapper[4860]: I0320 10:56:39.055120 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:39Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:39 crc kubenswrapper[4860]: I0320 10:56:39.084857 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://355de5a5f850fa7e12c9ee5b4b152314e331aca100b142486cde31a6ca5b9f77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c66567a97f03870425c40452e85d7f5d3d9692343d67c511060081eff71f24a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"message\\\":\\\"mers/externalversions/factory.go:140\\\\nI0320 10:56:35.889696 6663 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 10:56:35.890004 6663 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 
10:56:35.890027 6663 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 10:56:35.890077 6663 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 10:56:35.890136 6663 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 10:56:35.890097 6663 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 10:56:35.890208 6663 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 10:56:35.890288 6663 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 10:56:35.890314 6663 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 10:56:35.890331 6663 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 10:56:35.890321 6663 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 10:56:35.890392 6663 factory.go:656] Stopping watch factory\\\\nI0320 10:56:35.890425 6663 ovnkube.go:599] Stopped ovnkube\\\\nI0320 10:56:35.890405 6663 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://355de5a5f850fa7e12c9ee5b4b152314e331aca100b142486cde31a6ca5b9f77\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"curred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:38Z is after 2025-08-24T17:21:41Z]\\\\nI0320 10:56:38.278374 6996 services_controller.go:451] Built service openshift-etcd/etcd cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd/etcd_TCP_cluster\\\\\\\", 
UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd/etcd\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:2379, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:9979, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0320 10:56:38.278405 6996 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-additional-cni-plugins-wpj5w\\\\nI0320 10:56:38.278416 6996 services_controller.go:452] 
B\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\
\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nbkmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:39Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:39 crc kubenswrapper[4860]: I0320 10:56:39.107677 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"contain
erID\\\":\\\"cri-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c14e94ed75bdc67469f86002f56ed08fc3810f5a6316401c00a66d56d405e14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:39Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:39 crc kubenswrapper[4860]: I0320 10:56:39.143929 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:39Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:39 crc kubenswrapper[4860]: I0320 10:56:39.166465 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:39Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:39 crc kubenswrapper[4860]: I0320 10:56:39.188451 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a19086-4679-4d42-96b8-942e00d8491f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://029dd44d42d236df04007ba436ae1800ebdf356c32c3506f621c55418b50526e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a83599b2c542987a93965e3fd026abc1eccd
07fb78dc6ad777b03821eb4ed59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ngb2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:39Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:39 crc kubenswrapper[4860]: I0320 10:56:39.201483 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://299876d90de1dc71dcbd805a769b991971944194b57cda3dff90392f35318dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:39Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:39 crc kubenswrapper[4860]: I0320 10:56:39.214534 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ba3ecd4dbf40f800c11196492112694eb2fe160969cbd5f0b6fa7335552777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2176d4712fa4ee31d4322bb4cf9433058b2da0d6000e5bfb9e7d230fdffb3eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:39Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:39 crc kubenswrapper[4860]: I0320 10:56:39.230855 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:39Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:39 crc kubenswrapper[4860]: I0320 10:56:39.247256 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c4df2ec9fe14b2447ca7aa8d5d033c15f8ac4a7ce1cd4cb8430596293d9b9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:39Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:39 crc kubenswrapper[4860]: I0320 10:56:39.269708 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-
dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ec
d6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:39Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:39 crc kubenswrapper[4860]: I0320 10:56:39.285301 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:39Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:39 crc kubenswrapper[4860]: I0320 10:56:39.299856 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T10:56:39Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:39 crc kubenswrapper[4860]: I0320 10:56:39.313248 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tggrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2ebec10e00fab3194ba3501ab22f0d9381d9e75d28690a6ee5723da97fd226\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rv72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tggrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:39Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:39 crc kubenswrapper[4860]: I0320 10:56:39.328183 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q85gq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"035f0b3d-92ee-4564-8dad-28b231e1c800\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q85gq\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:39Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:39 crc kubenswrapper[4860]: I0320 10:56:39.345277 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"m
etrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:39Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:39 crc kubenswrapper[4860]: I0320 10:56:39.413104 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:39 crc kubenswrapper[4860]: E0320 10:56:39.413301 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:40 crc kubenswrapper[4860]: I0320 10:56:40.040164 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nbkmw_eb85f6f9-1c0f-4388-9464-25dfe48d8d0f/ovnkube-controller/1.log" Mar 20 10:56:40 crc kubenswrapper[4860]: I0320 10:56:40.045483 4860 scope.go:117] "RemoveContainer" containerID="355de5a5f850fa7e12c9ee5b4b152314e331aca100b142486cde31a6ca5b9f77" Mar 20 10:56:40 crc kubenswrapper[4860]: E0320 10:56:40.045786 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-nbkmw_openshift-ovn-kubernetes(eb85f6f9-1c0f-4388-9464-25dfe48d8d0f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" Mar 20 10:56:40 crc kubenswrapper[4860]: I0320 10:56:40.065199 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a19086-4679-4d42-96b8-942e00d8491f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://029dd44d42d236df04007ba436ae1800ebdf356c32c3506f621c55418b50526e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a83599b2c542987a93965e3fd026abc1eccd
07fb78dc6ad777b03821eb4ed59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ngb2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:40Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:40 crc kubenswrapper[4860]: I0320 10:56:40.094501 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:40Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:40 crc kubenswrapper[4860]: I0320 10:56:40.116821 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:40Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:40 crc kubenswrapper[4860]: I0320 10:56:40.133004 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://299876d90de1dc71dcbd805a769b991971944194b57cda3dff90392f35318dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:40Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:40 crc kubenswrapper[4860]: I0320 10:56:40.149956 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ba3ecd4dbf40f800c11196492112694eb2fe160969cbd5f0b6fa7335552777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2176d4712fa4ee31d4322bb4cf9433058b2da0d6000e5bfb9e7d230fdffb3eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:40Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:40 crc kubenswrapper[4860]: I0320 10:56:40.170259 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:40Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:40 crc kubenswrapper[4860]: I0320 10:56:40.191805 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c4df2ec9fe14b2447ca7aa8d5d033c15f8ac4a7ce1cd4cb8430596293d9b9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:40Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:40 crc kubenswrapper[4860]: I0320 10:56:40.209030 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:40Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:40 crc kubenswrapper[4860]: I0320 10:56:40.223116 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T10:56:40Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:40 crc kubenswrapper[4860]: I0320 10:56:40.238094 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tggrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2ebec10e00fab3194ba3501ab22f0d9381d9e75d28690a6ee5723da97fd226\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rv72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tggrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:40Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:40 crc kubenswrapper[4860]: I0320 10:56:40.251430 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q85gq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"035f0b3d-92ee-4564-8dad-28b231e1c800\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q85gq\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:40Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:40 crc kubenswrapper[4860]: I0320 10:56:40.267452 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c14e94ed75bdc67469f86002f56ed08fc3810f5a6316401c00a66d56d405e14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:40Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:40 crc kubenswrapper[4860]: I0320 10:56:40.282519 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:40Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:40 crc kubenswrapper[4860]: I0320 10:56:40.297191 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:40Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:40 crc kubenswrapper[4860]: I0320 10:56:40.313046 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:40Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:40 crc kubenswrapper[4860]: I0320 10:56:40.334473 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://355de5a5f850fa7e12c9ee5b4b152314e331aca100b142486cde31a6ca5b9f77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://355de5a5f850fa7e12c9ee5b4b152314e331aca100b142486cde31a6ca5b9f77\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"curred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:38Z is after 2025-08-24T17:21:41Z]\\\\nI0320 
10:56:38.278374 6996 services_controller.go:451] Built service openshift-etcd/etcd cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd/etcd_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd/etcd\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:2379, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:9979, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0320 10:56:38.278405 6996 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-additional-cni-plugins-wpj5w\\\\nI0320 10:56:38.278416 6996 services_controller.go:452] B\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nbkmw_openshift-ovn-kubernetes(eb85f6f9-1c0f-4388-9464-25dfe48d8d0f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f
8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nbkmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:40Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:40 crc kubenswrapper[4860]: I0320 10:56:40.412535 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:40 crc kubenswrapper[4860]: I0320 10:56:40.412535 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:56:40 crc kubenswrapper[4860]: I0320 10:56:40.412541 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:40 crc kubenswrapper[4860]: E0320 10:56:40.412819 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:56:40 crc kubenswrapper[4860]: E0320 10:56:40.412666 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:40 crc kubenswrapper[4860]: E0320 10:56:40.412889 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:41 crc kubenswrapper[4860]: I0320 10:56:41.412827 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:41 crc kubenswrapper[4860]: E0320 10:56:41.413051 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:42 crc kubenswrapper[4860]: I0320 10:56:42.412507 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:56:42 crc kubenswrapper[4860]: I0320 10:56:42.412533 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:42 crc kubenswrapper[4860]: I0320 10:56:42.412533 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:42 crc kubenswrapper[4860]: E0320 10:56:42.413949 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:56:42 crc kubenswrapper[4860]: E0320 10:56:42.414252 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:42 crc kubenswrapper[4860]: E0320 10:56:42.414355 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:42 crc kubenswrapper[4860]: E0320 10:56:42.588080 4860 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 10:56:42 crc kubenswrapper[4860]: I0320 10:56:42.643702 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:56:42 crc kubenswrapper[4860]: I0320 10:56:42.670541 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"da
ta-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c687744
1ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:42Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:42 crc kubenswrapper[4860]: I0320 10:56:42.693815 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:42Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:42 crc kubenswrapper[4860]: I0320 10:56:42.710809 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://299876d90de1dc71dcbd805a769b991971944194b57cda3dff90392f35318dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:42Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:42 crc kubenswrapper[4860]: I0320 10:56:42.729597 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ba3ecd4dbf40f800c11196492112694eb2fe160969cbd5f0b6fa7335552777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2176d4712fa4ee31d4322bb4cf9433058b2da0d6000e5bfb9e7d230fdffb3eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:42Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:42 crc kubenswrapper[4860]: I0320 10:56:42.747052 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:42Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:42 crc kubenswrapper[4860]: I0320 10:56:42.767472 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c4df2ec9fe14b2447ca7aa8d5d033c15f8ac4a7ce1cd4cb8430596293d9b9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:42Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:42 crc kubenswrapper[4860]: I0320 10:56:42.780665 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:42Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:42 crc kubenswrapper[4860]: I0320 10:56:42.792094 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T10:56:42Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:42 crc kubenswrapper[4860]: I0320 10:56:42.802929 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tggrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2ebec10e00fab3194ba3501ab22f0d9381d9e75d28690a6ee5723da97fd226\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rv72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tggrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:42Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:42 crc kubenswrapper[4860]: I0320 10:56:42.814792 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/035f0b3d-92ee-4564-8dad-28b231e1c800-metrics-certs\") pod \"network-metrics-daemon-q85gq\" (UID: \"035f0b3d-92ee-4564-8dad-28b231e1c800\") " pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:56:42 crc kubenswrapper[4860]: E0320 10:56:42.814985 4860 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 10:56:42 crc kubenswrapper[4860]: E0320 10:56:42.815064 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/035f0b3d-92ee-4564-8dad-28b231e1c800-metrics-certs podName:035f0b3d-92ee-4564-8dad-28b231e1c800 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:50.815048463 +0000 UTC m=+135.036409361 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/035f0b3d-92ee-4564-8dad-28b231e1c800-metrics-certs") pod "network-metrics-daemon-q85gq" (UID: "035f0b3d-92ee-4564-8dad-28b231e1c800") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 10:56:42 crc kubenswrapper[4860]: I0320 10:56:42.817007 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q85gq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"035f0b3d-92ee-4564-8dad-28b231e1c800\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q85gq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:42Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:42 crc 
kubenswrapper[4860]: I0320 10:56:42.830455 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:42Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:42 crc kubenswrapper[4860]: I0320 10:56:42.845458 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:42Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:42 crc kubenswrapper[4860]: I0320 10:56:42.859679 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:42Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:42 crc kubenswrapper[4860]: I0320 10:56:42.879371 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://355de5a5f850fa7e12c9ee5b4b152314e331aca100b142486cde31a6ca5b9f77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://355de5a5f850fa7e12c9ee5b4b152314e331aca100b142486cde31a6ca5b9f77\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"curred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:38Z is after 2025-08-24T17:21:41Z]\\\\nI0320 
10:56:38.278374 6996 services_controller.go:451] Built service openshift-etcd/etcd cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd/etcd_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd/etcd\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:2379, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:9979, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0320 10:56:38.278405 6996 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-additional-cni-plugins-wpj5w\\\\nI0320 10:56:38.278416 6996 services_controller.go:452] B\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nbkmw_openshift-ovn-kubernetes(eb85f6f9-1c0f-4388-9464-25dfe48d8d0f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f
8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nbkmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:42Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:42 crc kubenswrapper[4860]: I0320 10:56:42.893928 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c14e94ed75bdc67469f86002f56ed08fc3810f5a6316401c00a66d56d405e14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5
a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:42Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:42 crc kubenswrapper[4860]: I0320 10:56:42.905794 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a19086-4679-4d42-96b8-942e00d8491f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://029dd44d42d236df04007ba436ae1800ebdf356c32c3506f621c55418b50526e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a83599b2c542987a93965e3fd026abc1eccd
07fb78dc6ad777b03821eb4ed59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ngb2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:42Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:43 crc kubenswrapper[4860]: I0320 10:56:43.412829 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:43 crc kubenswrapper[4860]: E0320 10:56:43.413316 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:44 crc kubenswrapper[4860]: I0320 10:56:44.412473 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:44 crc kubenswrapper[4860]: I0320 10:56:44.412943 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:56:44 crc kubenswrapper[4860]: I0320 10:56:44.413255 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:44 crc kubenswrapper[4860]: E0320 10:56:44.413697 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:44 crc kubenswrapper[4860]: E0320 10:56:44.413905 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:56:44 crc kubenswrapper[4860]: E0320 10:56:44.413981 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:44 crc kubenswrapper[4860]: I0320 10:56:44.426669 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 20 10:56:45 crc kubenswrapper[4860]: I0320 10:56:45.412675 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:45 crc kubenswrapper[4860]: E0320 10:56:45.412902 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:46 crc kubenswrapper[4860]: I0320 10:56:46.412905 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:56:46 crc kubenswrapper[4860]: I0320 10:56:46.413002 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:46 crc kubenswrapper[4860]: I0320 10:56:46.412905 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:46 crc kubenswrapper[4860]: E0320 10:56:46.413124 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:56:46 crc kubenswrapper[4860]: E0320 10:56:46.413379 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:46 crc kubenswrapper[4860]: E0320 10:56:46.413558 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:46 crc kubenswrapper[4860]: I0320 10:56:46.956827 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:46 crc kubenswrapper[4860]: I0320 10:56:46.956872 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:46 crc kubenswrapper[4860]: I0320 10:56:46.956888 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:46 crc kubenswrapper[4860]: I0320 10:56:46.956904 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:46 crc kubenswrapper[4860]: I0320 10:56:46.956916 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:46Z","lastTransitionTime":"2026-03-20T10:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:46 crc kubenswrapper[4860]: E0320 10:56:46.973903 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:46Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:46 crc kubenswrapper[4860]: I0320 10:56:46.977586 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:46 crc kubenswrapper[4860]: I0320 10:56:46.977617 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:46 crc kubenswrapper[4860]: I0320 10:56:46.977627 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:46 crc kubenswrapper[4860]: I0320 10:56:46.977645 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:46 crc kubenswrapper[4860]: I0320 10:56:46.977656 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:46Z","lastTransitionTime":"2026-03-20T10:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:46 crc kubenswrapper[4860]: E0320 10:56:46.990898 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:46Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:46 crc kubenswrapper[4860]: I0320 10:56:46.995180 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:46 crc kubenswrapper[4860]: I0320 10:56:46.995251 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:46 crc kubenswrapper[4860]: I0320 10:56:46.995265 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:46 crc kubenswrapper[4860]: I0320 10:56:46.995283 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:46 crc kubenswrapper[4860]: I0320 10:56:46.995297 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:46Z","lastTransitionTime":"2026-03-20T10:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:47 crc kubenswrapper[4860]: E0320 10:56:47.009545 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:47Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:47 crc kubenswrapper[4860]: I0320 10:56:47.013815 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:47 crc kubenswrapper[4860]: I0320 10:56:47.013855 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:47 crc kubenswrapper[4860]: I0320 10:56:47.013865 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:47 crc kubenswrapper[4860]: I0320 10:56:47.013882 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:47 crc kubenswrapper[4860]: I0320 10:56:47.013893 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:47Z","lastTransitionTime":"2026-03-20T10:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:47 crc kubenswrapper[4860]: E0320 10:56:47.027413 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:47Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:47 crc kubenswrapper[4860]: I0320 10:56:47.032289 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:47 crc kubenswrapper[4860]: I0320 10:56:47.032352 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:47 crc kubenswrapper[4860]: I0320 10:56:47.032370 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:47 crc kubenswrapper[4860]: I0320 10:56:47.032409 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:47 crc kubenswrapper[4860]: I0320 10:56:47.032427 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:47Z","lastTransitionTime":"2026-03-20T10:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:47 crc kubenswrapper[4860]: E0320 10:56:47.047139 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:47Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:47 crc kubenswrapper[4860]: E0320 10:56:47.047344 4860 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 10:56:47 crc kubenswrapper[4860]: I0320 10:56:47.413286 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:47 crc kubenswrapper[4860]: E0320 10:56:47.413504 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:47 crc kubenswrapper[4860]: I0320 10:56:47.439422 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8
ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314
731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-2
0T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:47Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:47 crc kubenswrapper[4860]: I0320 10:56:47.456012 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:47Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:47 crc kubenswrapper[4860]: I0320 10:56:47.469697 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://299876d90de1dc71dcbd805a769b991971944194b57cda3dff90392f35318dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:47Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:47 crc kubenswrapper[4860]: I0320 10:56:47.483765 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ba3ecd4dbf40f800c11196492112694eb2fe160969cbd5f0b6fa7335552777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2176d4712fa4ee31d4322bb4cf9433058b2da0d6000e5bfb9e7d230fdffb3eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:47Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:47 crc kubenswrapper[4860]: I0320 10:56:47.499863 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:47Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:47 crc kubenswrapper[4860]: I0320 10:56:47.518437 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c4df2ec9fe14b2447ca7aa8d5d033c15f8ac4a7ce1cd4cb8430596293d9b9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:47Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:47 crc kubenswrapper[4860]: I0320 10:56:47.537524 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e92d2fc-00a4-4ad4-873d-d8d49b45c703\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9917a32ef81256fec12a0a0679be3cf3a2e1f5dab824c4c23c2cf252433a68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b968284eb846046e80483bafc625da31241ff1781ebd5216342bc2538064d124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acfeabcb583a79ac597fdc7d5ea5a7e28192a337743c9ac0d585d931e1a4406c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes
/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74e43d4563dfd8a16d0e44d9e42cbad850bf1fe0f96f4cf7d1699ea3b0beec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e74e43d4563dfd8a16d0e44d9e42cbad850bf1fe0f96f4cf7d1699ea3b0beec9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:47Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:47 crc kubenswrapper[4860]: I0320 10:56:47.554938 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:47Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:47 crc kubenswrapper[4860]: I0320 10:56:47.570393 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:47Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:47 crc kubenswrapper[4860]: I0320 10:56:47.583268 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tggrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2ebec10e00fab3194ba3501ab22f0d9381d9e75d28690a6ee5723da97fd226\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"n
ode-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rv72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tggrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:47Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:47 crc kubenswrapper[4860]: E0320 10:56:47.588770 4860 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 10:56:47 crc kubenswrapper[4860]: I0320 10:56:47.598109 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q85gq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"035f0b3d-92ee-4564-8dad-28b231e1c800\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q85gq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:47Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:47 crc 
kubenswrapper[4860]: I0320 10:56:47.613343 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:47Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:47 crc kubenswrapper[4860]: I0320 10:56:47.627953 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:47Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:47 crc kubenswrapper[4860]: I0320 10:56:47.640979 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:47Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:47 crc kubenswrapper[4860]: I0320 10:56:47.664653 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://355de5a5f850fa7e12c9ee5b4b152314e331aca100b142486cde31a6ca5b9f77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://355de5a5f850fa7e12c9ee5b4b152314e331aca100b142486cde31a6ca5b9f77\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"curred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:38Z is after 2025-08-24T17:21:41Z]\\\\nI0320 
10:56:38.278374 6996 services_controller.go:451] Built service openshift-etcd/etcd cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd/etcd_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd/etcd\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:2379, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:9979, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0320 10:56:38.278405 6996 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-additional-cni-plugins-wpj5w\\\\nI0320 10:56:38.278416 6996 services_controller.go:452] B\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nbkmw_openshift-ovn-kubernetes(eb85f6f9-1c0f-4388-9464-25dfe48d8d0f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f
8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nbkmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:47Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:47 crc kubenswrapper[4860]: I0320 10:56:47.681527 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c14e94ed75bdc67469f86002f56ed08fc3810f5a6316401c00a66d56d405e14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5
a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:47Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:47 crc kubenswrapper[4860]: I0320 10:56:47.695055 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a19086-4679-4d42-96b8-942e00d8491f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://029dd44d42d236df04007ba436ae1800ebdf356c32c3506f621c55418b50526e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a83599b2c542987a93965e3fd026abc1eccd
07fb78dc6ad777b03821eb4ed59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ngb2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:47Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:48 crc kubenswrapper[4860]: I0320 10:56:48.412634 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:56:48 crc kubenswrapper[4860]: I0320 10:56:48.412749 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:48 crc kubenswrapper[4860]: E0320 10:56:48.412802 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:56:48 crc kubenswrapper[4860]: I0320 10:56:48.412891 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:48 crc kubenswrapper[4860]: E0320 10:56:48.413265 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:48 crc kubenswrapper[4860]: E0320 10:56:48.413472 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:49 crc kubenswrapper[4860]: I0320 10:56:49.413300 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:49 crc kubenswrapper[4860]: E0320 10:56:49.413516 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:50 crc kubenswrapper[4860]: I0320 10:56:50.413347 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:50 crc kubenswrapper[4860]: I0320 10:56:50.413388 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:56:50 crc kubenswrapper[4860]: I0320 10:56:50.413379 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:50 crc kubenswrapper[4860]: E0320 10:56:50.413507 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:50 crc kubenswrapper[4860]: E0320 10:56:50.413771 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:56:50 crc kubenswrapper[4860]: E0320 10:56:50.413877 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:50 crc kubenswrapper[4860]: I0320 10:56:50.909763 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/035f0b3d-92ee-4564-8dad-28b231e1c800-metrics-certs\") pod \"network-metrics-daemon-q85gq\" (UID: \"035f0b3d-92ee-4564-8dad-28b231e1c800\") " pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:56:50 crc kubenswrapper[4860]: E0320 10:56:50.910023 4860 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 10:56:50 crc kubenswrapper[4860]: E0320 10:56:50.910155 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/035f0b3d-92ee-4564-8dad-28b231e1c800-metrics-certs podName:035f0b3d-92ee-4564-8dad-28b231e1c800 nodeName:}" failed. No retries permitted until 2026-03-20 10:57:06.910128606 +0000 UTC m=+151.131489674 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/035f0b3d-92ee-4564-8dad-28b231e1c800-metrics-certs") pod "network-metrics-daemon-q85gq" (UID: "035f0b3d-92ee-4564-8dad-28b231e1c800") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 10:56:51 crc kubenswrapper[4860]: I0320 10:56:51.412361 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:51 crc kubenswrapper[4860]: E0320 10:56:51.412536 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:52 crc kubenswrapper[4860]: I0320 10:56:52.412787 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:56:52 crc kubenswrapper[4860]: E0320 10:56:52.413032 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:56:52 crc kubenswrapper[4860]: I0320 10:56:52.412799 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:52 crc kubenswrapper[4860]: I0320 10:56:52.414260 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:52 crc kubenswrapper[4860]: E0320 10:56:52.414305 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:52 crc kubenswrapper[4860]: E0320 10:56:52.414527 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:52 crc kubenswrapper[4860]: I0320 10:56:52.414683 4860 scope.go:117] "RemoveContainer" containerID="355de5a5f850fa7e12c9ee5b4b152314e331aca100b142486cde31a6ca5b9f77" Mar 20 10:56:52 crc kubenswrapper[4860]: E0320 10:56:52.590740 4860 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 10:56:53 crc kubenswrapper[4860]: I0320 10:56:53.092253 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nbkmw_eb85f6f9-1c0f-4388-9464-25dfe48d8d0f/ovnkube-controller/1.log" Mar 20 10:56:53 crc kubenswrapper[4860]: I0320 10:56:53.095501 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" event={"ID":"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f","Type":"ContainerStarted","Data":"b952609a27351fa4ebbd7a4f68307aee93a3505e07bb3cb86439e4d963d26cd7"} Mar 20 10:56:53 crc kubenswrapper[4860]: I0320 10:56:53.096090 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:53 crc kubenswrapper[4860]: I0320 10:56:53.108352 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a19086-4679-4d42-96b8-942e00d8491f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://029dd44d42d236df04007ba436ae1800ebdf356c32c
3506f621c55418b50526e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a83599b2c542987a93965e3fd026abc1eccd07fb78dc6ad777b03821eb4ed59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\
\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ngb2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:53Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:53 crc kubenswrapper[4860]: I0320 10:56:53.122663 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:53Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:53 crc kubenswrapper[4860]: I0320 10:56:53.135814 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://299876d90de1dc71dcbd805a769b991971944194b57cda3dff90392f35318dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:53Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:53 crc kubenswrapper[4860]: I0320 10:56:53.147964 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ba3ecd4dbf40f800c11196492112694eb2fe160969cbd5f0b6fa7335552777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2176d4712fa4ee31d4322bb4cf9433058b2da0d6000e5bfb9e7d230fdffb3eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:53Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:53 crc kubenswrapper[4860]: I0320 10:56:53.162001 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:53Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:53 crc kubenswrapper[4860]: I0320 10:56:53.182959 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c4df2ec9fe14b2447ca7aa8d5d033c15f8ac4a7ce1cd4cb8430596293d9b9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:53Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:53 crc kubenswrapper[4860]: I0320 10:56:53.209835 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-
dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ec
d6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:53Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:53 crc kubenswrapper[4860]: I0320 10:56:53.228403 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:53Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:53 crc kubenswrapper[4860]: I0320 10:56:53.244845 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:53Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:53 crc kubenswrapper[4860]: I0320 10:56:53.256473 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tggrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2ebec10e00fab3194ba3501ab22f0d9381d9e75d28690a6ee5723da97fd226\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"n
ode-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rv72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tggrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:53Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:53 crc kubenswrapper[4860]: I0320 10:56:53.270166 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q85gq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"035f0b3d-92ee-4564-8dad-28b231e1c800\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q85gq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:53Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:53 crc 
kubenswrapper[4860]: I0320 10:56:53.281215 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e92d2fc-00a4-4ad4-873d-d8d49b45c703\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9917a32ef81256fec12a0a0679be3cf3a2e1f5dab824c4c23c2cf252433a68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b968284eb846046e80483bafc625da31241ff1781ebd5216342bc2538064d124\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acfeabcb583a79ac597fdc7d5ea5a7e28192a337743c9ac0d585d931e1a4406c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74e43d4563dfd8a16d0e44d9e42cbad850bf1fe0f96f4cf7d1699ea3b0beec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e74e43d4563dfd8a16d0e44d9e42cbad850bf1fe0f96f4cf7d1699ea3b0beec9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:53Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:53 crc kubenswrapper[4860]: I0320 10:56:53.293793 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:53Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:53 crc kubenswrapper[4860]: I0320 10:56:53.308327 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:53Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:53 crc kubenswrapper[4860]: I0320 10:56:53.339477 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b952609a27351fa4ebbd7a4f68307aee93a3505e07bb3cb86439e4d963d26cd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://355de5a5f850fa7e12c9ee5b4b152314e331aca100b142486cde31a6ca5b9f77\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"curred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:38Z is after 2025-08-24T17:21:41Z]\\\\nI0320 
10:56:38.278374 6996 services_controller.go:451] Built service openshift-etcd/etcd cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd/etcd_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd/etcd\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:2379, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:9979, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0320 10:56:38.278405 6996 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-additional-cni-plugins-wpj5w\\\\nI0320 10:56:38.278416 6996 services_controller.go:452] 
B\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\"
:\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nbkmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:53Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:53 crc kubenswrapper[4860]: I0320 10:56:53.356511 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c14e94ed75bdc67469f86002f56ed08fc3810f5a6316401c00a66d56d405e14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5
a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:53Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:53 crc kubenswrapper[4860]: I0320 10:56:53.370488 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:53Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:53 crc kubenswrapper[4860]: I0320 10:56:53.413290 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:53 crc kubenswrapper[4860]: E0320 10:56:53.413451 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:54 crc kubenswrapper[4860]: I0320 10:56:54.101933 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nbkmw_eb85f6f9-1c0f-4388-9464-25dfe48d8d0f/ovnkube-controller/2.log" Mar 20 10:56:54 crc kubenswrapper[4860]: I0320 10:56:54.102917 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nbkmw_eb85f6f9-1c0f-4388-9464-25dfe48d8d0f/ovnkube-controller/1.log" Mar 20 10:56:54 crc kubenswrapper[4860]: I0320 10:56:54.107210 4860 generic.go:334] "Generic (PLEG): container finished" podID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerID="b952609a27351fa4ebbd7a4f68307aee93a3505e07bb3cb86439e4d963d26cd7" exitCode=1 Mar 20 10:56:54 crc kubenswrapper[4860]: I0320 10:56:54.107286 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" event={"ID":"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f","Type":"ContainerDied","Data":"b952609a27351fa4ebbd7a4f68307aee93a3505e07bb3cb86439e4d963d26cd7"} Mar 20 10:56:54 crc kubenswrapper[4860]: I0320 10:56:54.107333 4860 scope.go:117] "RemoveContainer" containerID="355de5a5f850fa7e12c9ee5b4b152314e331aca100b142486cde31a6ca5b9f77" Mar 20 10:56:54 crc kubenswrapper[4860]: I0320 10:56:54.108071 4860 scope.go:117] "RemoveContainer" containerID="b952609a27351fa4ebbd7a4f68307aee93a3505e07bb3cb86439e4d963d26cd7" Mar 20 10:56:54 crc kubenswrapper[4860]: E0320 10:56:54.108279 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-nbkmw_openshift-ovn-kubernetes(eb85f6f9-1c0f-4388-9464-25dfe48d8d0f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" Mar 20 10:56:54 crc kubenswrapper[4860]: 
I0320 10:56:54.130841 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a19086-4679-4d42-96b8-942e00d8491f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://029dd44d42d236df04007ba436ae1800ebdf356c32c3506f621c55418b50526e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\"
,\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a83599b2c542987a93965e3fd026abc1eccd07fb78dc6ad777b03821eb4ed59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ngb2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:54Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:54 crc kubenswrapper[4860]: I0320 10:56:54.145395 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://299876d90de1dc71dcbd805a769b991971944194b57cda3dff90392f35318dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:54Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:54 crc kubenswrapper[4860]: I0320 10:56:54.157041 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ba3ecd4dbf40f800c11196492112694eb2fe160969cbd5f0b6fa7335552777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2176d4712fa4ee31d4322bb4cf9433058b2da0d6000e5bfb9e7d230fdffb3eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:54Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:54 crc kubenswrapper[4860]: I0320 10:56:54.172250 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:54Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:54 crc kubenswrapper[4860]: I0320 10:56:54.193677 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c4df2ec9fe14b2447ca7aa8d5d033c15f8ac4a7ce1cd4cb8430596293d9b9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:54Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:54 crc kubenswrapper[4860]: I0320 10:56:54.218415 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-
dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ec
d6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:54Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:54 crc kubenswrapper[4860]: I0320 10:56:54.231589 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:54Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:54 crc kubenswrapper[4860]: I0320 10:56:54.247304 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T10:56:54Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:54 crc kubenswrapper[4860]: I0320 10:56:54.261747 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tggrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2ebec10e00fab3194ba3501ab22f0d9381d9e75d28690a6ee5723da97fd226\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rv72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tggrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:54Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:54 crc kubenswrapper[4860]: I0320 10:56:54.276050 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q85gq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"035f0b3d-92ee-4564-8dad-28b231e1c800\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q85gq\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:54Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:54 crc kubenswrapper[4860]: I0320 10:56:54.302368 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e92d2fc-00a4-4ad4-873d-d8d49b45c703\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9917a32ef81256fec12a0a0679be3cf3a2e1f5dab824c4c23c2cf252433a68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b968284eb846046e80483bafc625da31241ff1781ebd5216342bc2538064d124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acfeabcb583a79ac597fdc7d5ea5a7e28192a337743c9ac0d585d931e1a4406c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74e43d4563dfd8a16d0e44d9e42cbad850bf1fe0
f96f4cf7d1699ea3b0beec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e74e43d4563dfd8a16d0e44d9e42cbad850bf1fe0f96f4cf7d1699ea3b0beec9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:54Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:54 crc kubenswrapper[4860]: I0320 10:56:54.321744 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:54Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:54 crc kubenswrapper[4860]: I0320 10:56:54.341387 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:54Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:54 crc kubenswrapper[4860]: I0320 10:56:54.372284 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b952609a27351fa4ebbd7a4f68307aee93a3505e07bb3cb86439e4d963d26cd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://355de5a5f850fa7e12c9ee5b4b152314e331aca100b142486cde31a6ca5b9f77\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"curred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:38Z is after 2025-08-24T17:21:41Z]\\\\nI0320 
10:56:38.278374 6996 services_controller.go:451] Built service openshift-etcd/etcd cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd/etcd_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd/etcd\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:2379, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:9979, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0320 10:56:38.278405 6996 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-additional-cni-plugins-wpj5w\\\\nI0320 10:56:38.278416 6996 services_controller.go:452] B\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b952609a27351fa4ebbd7a4f68307aee93a3505e07bb3cb86439e4d963d26cd7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:56:53Z\\\",\\\"message\\\":\\\"handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 10:56:53.462992 7196 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 10:56:53.463000 7196 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 10:56:53.463008 7196 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 10:56:53.463019 7196 handler.go:208] Removed *v1.EgressIP event 
handler 8\\\\nI0320 10:56:53.463348 7196 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0320 10:56:53.463395 7196 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 10:56:53.463446 7196 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 10:56:53.463458 7196 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0320 10:56:53.463471 7196 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0320 10:56:53.463476 7196 factory.go:656] Stopping watch factory\\\\nI0320 10:56:53.463506 7196 ovnkube.go:599] Stopped ovnkube\\\\nI0320 10:56:53.463520 7196 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 10:56:53.463508 7196 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 10:56:53.463560 7196 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0320 10:56:53.463584 7196 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 10:56:53.463671 7196 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e
7e238c11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nbkmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:54Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:54 crc kubenswrapper[4860]: I0320 10:56:54.391197 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c14e94ed75bdc67469f86002f56ed08fc3810f5a6316401c00a66d56d405e14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5
a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:54Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:54 crc kubenswrapper[4860]: I0320 10:56:54.406416 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:54Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:54 crc kubenswrapper[4860]: I0320 10:56:54.412855 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:56:54 crc kubenswrapper[4860]: I0320 10:56:54.412901 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:54 crc kubenswrapper[4860]: I0320 10:56:54.412855 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:54 crc kubenswrapper[4860]: E0320 10:56:54.413090 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:56:54 crc kubenswrapper[4860]: E0320 10:56:54.413170 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:54 crc kubenswrapper[4860]: E0320 10:56:54.413330 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:54 crc kubenswrapper[4860]: I0320 10:56:54.426177 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":
\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:54Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:55 crc kubenswrapper[4860]: I0320 10:56:55.112636 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nbkmw_eb85f6f9-1c0f-4388-9464-25dfe48d8d0f/ovnkube-controller/2.log" Mar 20 10:56:55 crc kubenswrapper[4860]: I0320 10:56:55.116488 4860 scope.go:117] "RemoveContainer" containerID="b952609a27351fa4ebbd7a4f68307aee93a3505e07bb3cb86439e4d963d26cd7" Mar 20 10:56:55 crc kubenswrapper[4860]: E0320 10:56:55.116814 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-nbkmw_openshift-ovn-kubernetes(eb85f6f9-1c0f-4388-9464-25dfe48d8d0f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" Mar 20 10:56:55 crc kubenswrapper[4860]: I0320 10:56:55.140771 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/
manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf28
98d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5
646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\
"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:55Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:55 crc kubenswrapper[4860]: I0320 10:56:55.153893 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:55Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:55 crc kubenswrapper[4860]: I0320 10:56:55.165064 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://299876d90de1dc71dcbd805a769b991971944194b57cda3dff90392f35318dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:55Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:55 crc kubenswrapper[4860]: I0320 10:56:55.180410 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ba3ecd4dbf40f800c11196492112694eb2fe160969cbd5f0b6fa7335552777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2176d4712fa4ee31d4322bb4cf9433058b2da0d6000e5bfb9e7d230fdffb3eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:55Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:55 crc kubenswrapper[4860]: I0320 10:56:55.204461 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:55Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:55 crc kubenswrapper[4860]: I0320 10:56:55.225044 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c4df2ec9fe14b2447ca7aa8d5d033c15f8ac4a7ce1cd4cb8430596293d9b9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:55Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:55 crc kubenswrapper[4860]: I0320 10:56:55.241026 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e92d2fc-00a4-4ad4-873d-d8d49b45c703\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9917a32ef81256fec12a0a0679be3cf3a2e1f5dab824c4c23c2cf252433a68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b968284eb846046e80483bafc625da31241ff1781ebd5216342bc2538064d124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acfeabcb583a79ac597fdc7d5ea5a7e28192a337743c9ac0d585d931e1a4406c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes
/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74e43d4563dfd8a16d0e44d9e42cbad850bf1fe0f96f4cf7d1699ea3b0beec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e74e43d4563dfd8a16d0e44d9e42cbad850bf1fe0f96f4cf7d1699ea3b0beec9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:55Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:55 crc kubenswrapper[4860]: I0320 10:56:55.260275 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:55Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:55 crc kubenswrapper[4860]: I0320 10:56:55.278675 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:55Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:55 crc kubenswrapper[4860]: I0320 10:56:55.291715 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tggrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2ebec10e00fab3194ba3501ab22f0d9381d9e75d28690a6ee5723da97fd226\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"n
ode-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rv72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tggrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:55Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:55 crc kubenswrapper[4860]: I0320 10:56:55.306470 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q85gq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"035f0b3d-92ee-4564-8dad-28b231e1c800\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q85gq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:55Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:55 crc 
kubenswrapper[4860]: I0320 10:56:55.321716 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:55Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:55 crc kubenswrapper[4860]: I0320 10:56:55.336826 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:55Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:55 crc kubenswrapper[4860]: I0320 10:56:55.351688 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:55Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:55 crc kubenswrapper[4860]: I0320 10:56:55.382935 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b952609a27351fa4ebbd7a4f68307aee93a3505e07bb3cb86439e4d963d26cd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b952609a27351fa4ebbd7a4f68307aee93a3505e07bb3cb86439e4d963d26cd7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:56:53Z\\\",\\\"message\\\":\\\"handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 10:56:53.462992 7196 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 10:56:53.463000 7196 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 10:56:53.463008 7196 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 10:56:53.463019 7196 handler.go:208] 
Removed *v1.EgressIP event handler 8\\\\nI0320 10:56:53.463348 7196 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0320 10:56:53.463395 7196 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 10:56:53.463446 7196 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 10:56:53.463458 7196 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0320 10:56:53.463471 7196 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0320 10:56:53.463476 7196 factory.go:656] Stopping watch factory\\\\nI0320 10:56:53.463506 7196 ovnkube.go:599] Stopped ovnkube\\\\nI0320 10:56:53.463520 7196 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 10:56:53.463508 7196 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 10:56:53.463560 7196 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0320 10:56:53.463584 7196 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 10:56:53.463671 7196 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nbkmw_openshift-ovn-kubernetes(eb85f6f9-1c0f-4388-9464-25dfe48d8d0f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f
8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nbkmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:55Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:55 crc kubenswrapper[4860]: I0320 10:56:55.404453 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c14e94ed75bdc67469f86002f56ed08fc3810f5a6316401c00a66d56d405e14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5
a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:55Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:55 crc kubenswrapper[4860]: I0320 10:56:55.412571 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:55 crc kubenswrapper[4860]: E0320 10:56:55.412718 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:55 crc kubenswrapper[4860]: I0320 10:56:55.421161 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a19086-4679-4d42-96b8-942e00d8491f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://029dd44d42d236df04007ba436ae1800ebdf356c32c3506f621c55418b50526e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metri
cs-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a83599b2c542987a93965e3fd026abc1eccd07fb78dc6ad777b03821eb4ed59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ngb2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:55Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:56 crc kubenswrapper[4860]: I0320 10:56:56.413087 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:56 crc kubenswrapper[4860]: I0320 10:56:56.413147 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:56 crc kubenswrapper[4860]: I0320 10:56:56.413110 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:56:56 crc kubenswrapper[4860]: E0320 10:56:56.413322 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:56 crc kubenswrapper[4860]: E0320 10:56:56.413502 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:56:56 crc kubenswrapper[4860]: E0320 10:56:56.413641 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.144282 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.144345 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.144358 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.144590 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.144681 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:57Z","lastTransitionTime":"2026-03-20T10:56:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:57 crc kubenswrapper[4860]: E0320 10:56:57.167630 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:57Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.173356 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.173408 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.173423 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.173448 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.173463 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:57Z","lastTransitionTime":"2026-03-20T10:56:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:57 crc kubenswrapper[4860]: E0320 10:56:57.193051 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:57Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.197847 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.197907 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.197919 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.197941 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.197954 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:57Z","lastTransitionTime":"2026-03-20T10:56:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:57 crc kubenswrapper[4860]: E0320 10:56:57.209644 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:57Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.214687 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.214730 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.214743 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.214764 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.214780 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:57Z","lastTransitionTime":"2026-03-20T10:56:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:57 crc kubenswrapper[4860]: E0320 10:56:57.227830 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:57Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.231876 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.231937 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.231948 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.231967 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.231982 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:57Z","lastTransitionTime":"2026-03-20T10:56:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:57 crc kubenswrapper[4860]: E0320 10:56:57.246899 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:57Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:57 crc kubenswrapper[4860]: E0320 10:56:57.247092 4860 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.412470 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:57 crc kubenswrapper[4860]: E0320 10:56:57.412608 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.425511 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a19086-4679-4d42-96b8-942e00d8491f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://029dd44d42d236df04007ba436ae1800ebdf356c32c3506f621c55418b50526e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metri
cs-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a83599b2c542987a93965e3fd026abc1eccd07fb78dc6ad777b03821eb4ed59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ngb2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:57Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.446813 4860 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63
\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6e
f52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mount
Path\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:57Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.462878 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:57Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.474741 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://299876d90de1dc71dcbd805a769b991971944194b57cda3dff90392f35318dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:57Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.489962 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ba3ecd4dbf40f800c11196492112694eb2fe160969cbd5f0b6fa7335552777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2176d4712fa4ee31d4322bb4cf9433058b2da0d6000e5bfb9e7d230fdffb3eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:57Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.508125 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:57Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.531100 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c4df2ec9fe14b2447ca7aa8d5d033c15f8ac4a7ce1cd4cb8430596293d9b9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:57Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.545966 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e92d2fc-00a4-4ad4-873d-d8d49b45c703\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9917a32ef81256fec12a0a0679be3cf3a2e1f5dab824c4c23c2cf252433a68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b968284eb846046e80483bafc625da31241ff1781ebd5216342bc2538064d124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acfeabcb583a79ac597fdc7d5ea5a7e28192a337743c9ac0d585d931e1a4406c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes
/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74e43d4563dfd8a16d0e44d9e42cbad850bf1fe0f96f4cf7d1699ea3b0beec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e74e43d4563dfd8a16d0e44d9e42cbad850bf1fe0f96f4cf7d1699ea3b0beec9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:57Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.560779 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:57Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.573141 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:57Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.588944 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tggrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2ebec10e00fab3194ba3501ab22f0d9381d9e75d28690a6ee5723da97fd226\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"n
ode-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rv72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tggrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:57Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:57 crc kubenswrapper[4860]: E0320 10:56:57.592095 4860 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.611145 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q85gq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"035f0b3d-92ee-4564-8dad-28b231e1c800\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q85gq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:57Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:57 crc 
kubenswrapper[4860]: I0320 10:56:57.629631 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd2
5da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c14e94ed75bdc67469f86002f56ed08fc3810f5a6316401c00a66d56d405e14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 
10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:57Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.644701 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:57Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.658663 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:57Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.685486 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:57Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.721834 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b952609a27351fa4ebbd7a4f68307aee93a3505e07bb3cb86439e4d963d26cd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b952609a27351fa4ebbd7a4f68307aee93a3505e07bb3cb86439e4d963d26cd7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:56:53Z\\\",\\\"message\\\":\\\"handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 10:56:53.462992 7196 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 10:56:53.463000 7196 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 10:56:53.463008 7196 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 10:56:53.463019 7196 handler.go:208] 
Removed *v1.EgressIP event handler 8\\\\nI0320 10:56:53.463348 7196 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0320 10:56:53.463395 7196 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 10:56:53.463446 7196 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 10:56:53.463458 7196 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0320 10:56:53.463471 7196 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0320 10:56:53.463476 7196 factory.go:656] Stopping watch factory\\\\nI0320 10:56:53.463506 7196 ovnkube.go:599] Stopped ovnkube\\\\nI0320 10:56:53.463520 7196 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 10:56:53.463508 7196 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 10:56:53.463560 7196 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0320 10:56:53.463584 7196 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 10:56:53.463671 7196 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nbkmw_openshift-ovn-kubernetes(eb85f6f9-1c0f-4388-9464-25dfe48d8d0f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f
8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nbkmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:57Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:58 crc kubenswrapper[4860]: I0320 10:56:58.412739 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:58 crc kubenswrapper[4860]: I0320 10:56:58.412776 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:58 crc kubenswrapper[4860]: I0320 10:56:58.412843 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:56:58 crc kubenswrapper[4860]: E0320 10:56:58.412953 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:58 crc kubenswrapper[4860]: E0320 10:56:58.413275 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:58 crc kubenswrapper[4860]: E0320 10:56:58.413714 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:56:59 crc kubenswrapper[4860]: I0320 10:56:59.412900 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:59 crc kubenswrapper[4860]: E0320 10:56:59.413171 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:00 crc kubenswrapper[4860]: I0320 10:57:00.412950 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:00 crc kubenswrapper[4860]: I0320 10:57:00.413152 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:00 crc kubenswrapper[4860]: I0320 10:57:00.413152 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:57:00 crc kubenswrapper[4860]: E0320 10:57:00.413846 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:00 crc kubenswrapper[4860]: E0320 10:57:00.414039 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:57:00 crc kubenswrapper[4860]: E0320 10:57:00.414294 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:01 crc kubenswrapper[4860]: I0320 10:57:01.413367 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:01 crc kubenswrapper[4860]: E0320 10:57:01.413628 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:02 crc kubenswrapper[4860]: I0320 10:57:02.412648 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:02 crc kubenswrapper[4860]: I0320 10:57:02.412698 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:02 crc kubenswrapper[4860]: E0320 10:57:02.412906 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:02 crc kubenswrapper[4860]: I0320 10:57:02.412686 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:57:02 crc kubenswrapper[4860]: E0320 10:57:02.413533 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:02 crc kubenswrapper[4860]: E0320 10:57:02.413625 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:57:02 crc kubenswrapper[4860]: E0320 10:57:02.594014 4860 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 10:57:03 crc kubenswrapper[4860]: I0320 10:57:03.412857 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:03 crc kubenswrapper[4860]: E0320 10:57:03.413133 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:04 crc kubenswrapper[4860]: I0320 10:57:04.413108 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:57:04 crc kubenswrapper[4860]: E0320 10:57:04.413290 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:57:04 crc kubenswrapper[4860]: I0320 10:57:04.413737 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:04 crc kubenswrapper[4860]: E0320 10:57:04.413786 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:04 crc kubenswrapper[4860]: I0320 10:57:04.413824 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:04 crc kubenswrapper[4860]: E0320 10:57:04.413865 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:05 crc kubenswrapper[4860]: I0320 10:57:05.413163 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:05 crc kubenswrapper[4860]: E0320 10:57:05.413406 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:06 crc kubenswrapper[4860]: I0320 10:57:06.413174 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:57:06 crc kubenswrapper[4860]: I0320 10:57:06.413196 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:06 crc kubenswrapper[4860]: I0320 10:57:06.413286 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:06 crc kubenswrapper[4860]: E0320 10:57:06.413693 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:57:06 crc kubenswrapper[4860]: E0320 10:57:06.413860 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:06 crc kubenswrapper[4860]: E0320 10:57:06.413920 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.008550 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/035f0b3d-92ee-4564-8dad-28b231e1c800-metrics-certs\") pod \"network-metrics-daemon-q85gq\" (UID: \"035f0b3d-92ee-4564-8dad-28b231e1c800\") " pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:57:07 crc kubenswrapper[4860]: E0320 10:57:07.008859 4860 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 10:57:07 crc kubenswrapper[4860]: E0320 10:57:07.008985 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/035f0b3d-92ee-4564-8dad-28b231e1c800-metrics-certs podName:035f0b3d-92ee-4564-8dad-28b231e1c800 nodeName:}" failed. 
No retries permitted until 2026-03-20 10:57:39.008948025 +0000 UTC m=+183.230308963 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/035f0b3d-92ee-4564-8dad-28b231e1c800-metrics-certs") pod "network-metrics-daemon-q85gq" (UID: "035f0b3d-92ee-4564-8dad-28b231e1c800") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.412736 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:07 crc kubenswrapper[4860]: E0320 10:57:07.412965 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.430201 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a19086-4679-4d42-96b8-942e00d8491f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://029dd44d42d236df04007ba436ae1800ebdf356c32c3506f621c55418b50526e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a83599b2c542987a93965e3fd026abc1eccd
07fb78dc6ad777b03821eb4ed59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ngb2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.446991 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://299876d90de1dc71dcbd805a769b991971944194b57cda3dff90392f35318dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.466539 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ba3ecd4dbf40f800c11196492112694eb2fe160969cbd5f0b6fa7335552777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2176d4712fa4ee31d4322bb4cf9433058b2da0d6000e5bfb9e7d230fdffb3eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.492187 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.514196 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c4df2ec9fe14b2447ca7aa8d5d033c15f8ac4a7ce1cd4cb8430596293d9b9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.542647 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-
dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ec
d6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.564142 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.584437 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T10:57:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:07 crc kubenswrapper[4860]: E0320 10:57:07.594844 4860 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.602793 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tggrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2ebec10e00fab3194ba3501ab22f0d9381d9e75d28690a6ee5723da97fd226\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nod
e-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rv72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tggrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.622584 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.622634 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.622653 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.622672 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.622686 4860 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:07Z","lastTransitionTime":"2026-03-20T10:57:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.623578 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q85gq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"035f0b3d-92ee-4564-8dad-28b231e1c800\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q85gq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:07 crc 
kubenswrapper[4860]: E0320 10:57:07.636510 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.641038 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.641188 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.641335 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.641418 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.641481 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:07Z","lastTransitionTime":"2026-03-20T10:57:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.641009 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e92d2fc-00a4-4ad4-873d-d8d49b45c703\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9917a32ef81256fec12a0a0679be3cf3a2e1f5dab824c4c23c2cf252433a68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b968284eb846046e80483bafc625da
31241ff1781ebd5216342bc2538064d124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acfeabcb583a79ac597fdc7d5ea5a7e28192a337743c9ac0d585d931e1a4406c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74e43d4563dfd8a16d0e44d9e42cbad850bf1fe0f96f4cf7d1699ea3b0beec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e74e43d4563dfd8a16d0e44d9e42cbad850bf1fe0f96f4cf7d1699ea3b0beec9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:07 crc kubenswrapper[4860]: E0320 10:57:07.655945 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.657254 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"na
me\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.659570 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.659622 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.659633 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.659657 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.659670 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:07Z","lastTransitionTime":"2026-03-20T10:57:07Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.672172 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:07 crc kubenswrapper[4860]: E0320 10:57:07.673984 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.679155 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.679192 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.679203 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.679238 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.679254 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:07Z","lastTransitionTime":"2026-03-20T10:57:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:57:07 crc kubenswrapper[4860]: E0320 10:57:07.693432 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.697624 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.697653 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.697665 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.697688 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.697701 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:07Z","lastTransitionTime":"2026-03-20T10:57:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.702145 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b952609a27351fa4ebbd7a4f68307aee93a3505e07bb3cb86439e4d963d26cd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b952609a27351fa4ebbd7a4f68307aee93a3505e07bb3cb86439e4d963d26cd7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:56:53Z\\\",\\\"message\\\":\\\"handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 10:56:53.462992 7196 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 10:56:53.463000 7196 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 10:56:53.463008 7196 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 10:56:53.463019 7196 handler.go:208] 
Removed *v1.EgressIP event handler 8\\\\nI0320 10:56:53.463348 7196 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0320 10:56:53.463395 7196 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 10:56:53.463446 7196 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 10:56:53.463458 7196 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0320 10:56:53.463471 7196 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0320 10:56:53.463476 7196 factory.go:656] Stopping watch factory\\\\nI0320 10:56:53.463506 7196 ovnkube.go:599] Stopped ovnkube\\\\nI0320 10:56:53.463520 7196 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 10:56:53.463508 7196 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 10:56:53.463560 7196 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0320 10:56:53.463584 7196 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 10:56:53.463671 7196 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nbkmw_openshift-ovn-kubernetes(eb85f6f9-1c0f-4388-9464-25dfe48d8d0f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f
8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nbkmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:07 crc kubenswrapper[4860]: E0320 10:57:07.709530 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:07 crc kubenswrapper[4860]: E0320 10:57:07.709678 4860 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.717390 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c14e94ed75bdc67469f86002f56ed08fc3810f5a6316401c00a66d56d405e14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5
a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.733951 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.750655 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:08 crc kubenswrapper[4860]: I0320 10:57:08.220116 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:57:08 crc kubenswrapper[4860]: E0320 10:57:08.220305 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:12.220280009 +0000 UTC m=+216.441640907 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:08 crc kubenswrapper[4860]: I0320 10:57:08.322075 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:08 crc kubenswrapper[4860]: I0320 10:57:08.322149 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:08 crc kubenswrapper[4860]: I0320 10:57:08.322177 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:08 crc kubenswrapper[4860]: I0320 10:57:08.322205 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:08 crc kubenswrapper[4860]: E0320 10:57:08.322379 4860 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 10:57:08 crc kubenswrapper[4860]: E0320 10:57:08.322409 4860 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:57:08 crc kubenswrapper[4860]: E0320 10:57:08.322455 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 10:58:12.322431324 +0000 UTC m=+216.543792242 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 10:57:08 crc kubenswrapper[4860]: E0320 10:57:08.322533 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 10:58:12.322503856 +0000 UTC m=+216.543864764 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:57:08 crc kubenswrapper[4860]: E0320 10:57:08.322675 4860 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 10:57:08 crc kubenswrapper[4860]: E0320 10:57:08.322691 4860 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 10:57:08 crc kubenswrapper[4860]: E0320 10:57:08.322705 4860 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:57:08 crc kubenswrapper[4860]: E0320 10:57:08.322740 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 10:58:12.322730833 +0000 UTC m=+216.544091741 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:57:08 crc kubenswrapper[4860]: E0320 10:57:08.323070 4860 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 10:57:08 crc kubenswrapper[4860]: E0320 10:57:08.323097 4860 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 10:57:08 crc kubenswrapper[4860]: E0320 10:57:08.323109 4860 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:57:08 crc kubenswrapper[4860]: E0320 10:57:08.323153 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 10:58:12.323139194 +0000 UTC m=+216.544500102 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:57:08 crc kubenswrapper[4860]: I0320 10:57:08.413287 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:57:08 crc kubenswrapper[4860]: I0320 10:57:08.413349 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:08 crc kubenswrapper[4860]: E0320 10:57:08.413703 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:08 crc kubenswrapper[4860]: E0320 10:57:08.414010 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:57:08 crc kubenswrapper[4860]: I0320 10:57:08.414193 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:08 crc kubenswrapper[4860]: E0320 10:57:08.415270 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:09 crc kubenswrapper[4860]: I0320 10:57:09.413593 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:09 crc kubenswrapper[4860]: E0320 10:57:09.414555 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:09 crc kubenswrapper[4860]: I0320 10:57:09.415287 4860 scope.go:117] "RemoveContainer" containerID="b952609a27351fa4ebbd7a4f68307aee93a3505e07bb3cb86439e4d963d26cd7" Mar 20 10:57:09 crc kubenswrapper[4860]: E0320 10:57:09.415745 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-nbkmw_openshift-ovn-kubernetes(eb85f6f9-1c0f-4388-9464-25dfe48d8d0f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" Mar 20 10:57:10 crc kubenswrapper[4860]: I0320 10:57:10.413394 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:10 crc kubenswrapper[4860]: I0320 10:57:10.413452 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:57:10 crc kubenswrapper[4860]: E0320 10:57:10.413560 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:10 crc kubenswrapper[4860]: E0320 10:57:10.413735 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:57:10 crc kubenswrapper[4860]: I0320 10:57:10.413772 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:10 crc kubenswrapper[4860]: E0320 10:57:10.413951 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:11 crc kubenswrapper[4860]: I0320 10:57:11.413312 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:11 crc kubenswrapper[4860]: E0320 10:57:11.413596 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:12 crc kubenswrapper[4860]: I0320 10:57:12.189016 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cmc44_a89c8af2-338f-401f-aad5-c6d7763a3b3a/kube-multus/0.log" Mar 20 10:57:12 crc kubenswrapper[4860]: I0320 10:57:12.189123 4860 generic.go:334] "Generic (PLEG): container finished" podID="a89c8af2-338f-401f-aad5-c6d7763a3b3a" containerID="b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311" exitCode=1 Mar 20 10:57:12 crc kubenswrapper[4860]: I0320 10:57:12.189171 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cmc44" event={"ID":"a89c8af2-338f-401f-aad5-c6d7763a3b3a","Type":"ContainerDied","Data":"b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311"} Mar 20 10:57:12 crc kubenswrapper[4860]: I0320 10:57:12.189779 4860 scope.go:117] "RemoveContainer" containerID="b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311" Mar 20 10:57:12 crc kubenswrapper[4860]: I0320 10:57:12.210733 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a19086-4679-4d42-96b8-942e00d8491f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://029dd44d42d236df04007ba436ae1800ebdf356c32c3506f621c55418b50526e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a83599b2c542987a93965e3fd026abc1eccd
07fb78dc6ad777b03821eb4ed59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ngb2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:12Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:12 crc kubenswrapper[4860]: I0320 10:57:12.250117 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:12Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:12 crc kubenswrapper[4860]: I0320 10:57:12.268806 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:12Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:12 crc kubenswrapper[4860]: I0320 10:57:12.284610 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://299876d90de1dc71dcbd805a769b991971944194b57cda3dff90392f35318dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:12Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:12 crc kubenswrapper[4860]: I0320 10:57:12.299185 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ba3ecd4dbf40f800c11196492112694eb2fe160969cbd5f0b6fa7335552777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2176d4712fa4ee31d4322bb4cf9433058b2da0d6000e5bfb9e7d230fdffb3eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:12Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:12 crc kubenswrapper[4860]: I0320 10:57:12.314260 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:57:11Z\\\",\\\"message\\\":\\\"2026-03-20T10:56:25+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_e1884f66-2279-4212-9c83-f9ac441fb962\\\\n2026-03-20T10:56:25+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_e1884f66-2279-4212-9c83-f9ac441fb962 to /host/opt/cni/bin/\\\\n2026-03-20T10:56:26Z [verbose] multus-daemon started\\\\n2026-03-20T10:56:26Z [verbose] Readiness Indicator file check\\\\n2026-03-20T10:57:11Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:12Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:12 crc kubenswrapper[4860]: I0320 10:57:12.331909 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c4df2ec9fe14b2447ca7aa8d5d033c15f8ac4a7ce1cd4cb8430596293d9b9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"
kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddf
bb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"reason\\
\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"image\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mount
Path\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:12Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:12 crc kubenswrapper[4860]: I0320 10:57:12.347253 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e92d2fc-00a4-4ad4-873d-d8d49b45c703\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9917a32ef81256fec12a0a0679be3cf3a2e1f5dab824c4c23c2cf252433a68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b968284eb846046e80483bafc625da31241ff1781ebd5216342bc2538064d124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acfeabcb583a79ac597fdc7d5ea5a7e28192a337743c9ac0d585d931e1a4406c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74e43d4563dfd8a16d0e44d9e42cbad850bf1fe0f96f4cf7d1699ea3b0beec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e74e43d4563dfd8a16d0e44d9e42cbad850bf1fe0f96f4cf7d1699ea3b0beec9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:12Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:12 crc kubenswrapper[4860]: I0320 10:57:12.364930 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:12Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:12 crc kubenswrapper[4860]: I0320 10:57:12.380345 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:12Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:12 crc kubenswrapper[4860]: I0320 10:57:12.392740 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tggrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2ebec10e00fab3194ba3501ab22f0d9381d9e75d28690a6ee5723da97fd226\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"n
ode-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rv72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tggrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:12Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:12 crc kubenswrapper[4860]: I0320 10:57:12.405300 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q85gq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"035f0b3d-92ee-4564-8dad-28b231e1c800\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q85gq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:12Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:12 crc 
kubenswrapper[4860]: I0320 10:57:12.412728 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:57:12 crc kubenswrapper[4860]: E0320 10:57:12.412874 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:57:12 crc kubenswrapper[4860]: I0320 10:57:12.413076 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:12 crc kubenswrapper[4860]: E0320 10:57:12.413152 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:12 crc kubenswrapper[4860]: I0320 10:57:12.413333 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:12 crc kubenswrapper[4860]: E0320 10:57:12.413425 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:12 crc kubenswrapper[4860]: I0320 10:57:12.419012 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-d
ir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c14e94ed75bdc67469f86002f56ed08fc3810f5a6316401c00a66d56d405e14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshi
ft-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' 
detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:12Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:12 crc kubenswrapper[4860]: I0320 10:57:12.432650 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:12Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:12 crc kubenswrapper[4860]: I0320 10:57:12.446185 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:12Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:12 crc kubenswrapper[4860]: I0320 10:57:12.459781 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:12Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:12 crc kubenswrapper[4860]: I0320 10:57:12.482707 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b952609a27351fa4ebbd7a4f68307aee93a3505e07bb3cb86439e4d963d26cd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b952609a27351fa4ebbd7a4f68307aee93a3505e07bb3cb86439e4d963d26cd7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:56:53Z\\\",\\\"message\\\":\\\"handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 10:56:53.462992 7196 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 10:56:53.463000 7196 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 10:56:53.463008 7196 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 10:56:53.463019 7196 handler.go:208] 
Removed *v1.EgressIP event handler 8\\\\nI0320 10:56:53.463348 7196 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0320 10:56:53.463395 7196 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 10:56:53.463446 7196 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 10:56:53.463458 7196 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0320 10:56:53.463471 7196 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0320 10:56:53.463476 7196 factory.go:656] Stopping watch factory\\\\nI0320 10:56:53.463506 7196 ovnkube.go:599] Stopped ovnkube\\\\nI0320 10:56:53.463520 7196 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 10:56:53.463508 7196 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 10:56:53.463560 7196 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0320 10:56:53.463584 7196 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 10:56:53.463671 7196 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nbkmw_openshift-ovn-kubernetes(eb85f6f9-1c0f-4388-9464-25dfe48d8d0f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f
8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nbkmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:12Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:12 crc kubenswrapper[4860]: E0320 10:57:12.596726 4860 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 10:57:13 crc kubenswrapper[4860]: I0320 10:57:13.197053 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cmc44_a89c8af2-338f-401f-aad5-c6d7763a3b3a/kube-multus/0.log" Mar 20 10:57:13 crc kubenswrapper[4860]: I0320 10:57:13.197151 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cmc44" event={"ID":"a89c8af2-338f-401f-aad5-c6d7763a3b3a","Type":"ContainerStarted","Data":"e1d897530152cf8d1ddca69100f0ae29a4da57552de29a47aaed46aa70fa805e"} Mar 20 10:57:13 crc kubenswrapper[4860]: I0320 10:57:13.215770 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://299876d90de1dc71dcbd805a769b991971944194b57cda3dff90392f35318dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:355123
35ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:13Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:13 crc kubenswrapper[4860]: I0320 10:57:13.237622 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ba3ecd4dbf40f800c11196492112694eb2fe160969cbd5f0b6fa7335552777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2176d4712fa4ee31d4322bb4cf9433058b2da0d6
000e5bfb9e7d230fdffb3eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:13Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:13 crc kubenswrapper[4860]: I0320 10:57:13.258406 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d897530152cf8d1ddca69100f0ae29a4da57552de29a47aaed46aa70fa805e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:57:11Z\\\",\\\"message\\\":\\\"2026-03-20T10:56:25+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_e1884f66-2279-4212-9c83-f9ac441fb962\\\\n2026-03-20T10:56:25+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_e1884f66-2279-4212-9c83-f9ac441fb962 to /host/opt/cni/bin/\\\\n2026-03-20T10:56:26Z [verbose] multus-daemon started\\\\n2026-03-20T10:56:26Z [verbose] 
Readiness Indicator file check\\\\n2026-03-20T10:57:11Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:57:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:13Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:13 crc kubenswrapper[4860]: I0320 10:57:13.283086 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c4
df2ec9fe14b2447ca7aa8d5d033c15f8ac4a7ce1cd4cb8430596293d9b9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:13Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:13 crc kubenswrapper[4860]: I0320 10:57:13.311948 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:13Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:13 crc kubenswrapper[4860]: I0320 10:57:13.335668 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:13Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:13 crc kubenswrapper[4860]: I0320 10:57:13.357640 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T10:57:13Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:13 crc kubenswrapper[4860]: I0320 10:57:13.374048 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tggrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2ebec10e00fab3194ba3501ab22f0d9381d9e75d28690a6ee5723da97fd226\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rv72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tggrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:13Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:13 crc kubenswrapper[4860]: I0320 10:57:13.392174 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q85gq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"035f0b3d-92ee-4564-8dad-28b231e1c800\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q85gq\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:13Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:13 crc kubenswrapper[4860]: I0320 10:57:13.409504 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e92d2fc-00a4-4ad4-873d-d8d49b45c703\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9917a32ef81256fec12a0a0679be3cf3a2e1f5dab824c4c23c2cf252433a68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b968284eb846046e80483bafc625da31241ff1781ebd5216342bc2538064d124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acfeabcb583a79ac597fdc7d5ea5a7e28192a337743c9ac0d585d931e1a4406c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74e43d4563dfd8a16d0e44d9e42cbad850bf1fe0
f96f4cf7d1699ea3b0beec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e74e43d4563dfd8a16d0e44d9e42cbad850bf1fe0f96f4cf7d1699ea3b0beec9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:13Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:13 crc kubenswrapper[4860]: I0320 10:57:13.412888 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:13 crc kubenswrapper[4860]: E0320 10:57:13.413042 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:13 crc kubenswrapper[4860]: I0320 10:57:13.432103 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"r
ecursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:13Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:13 crc kubenswrapper[4860]: I0320 10:57:13.449011 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:13Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:13 crc kubenswrapper[4860]: I0320 10:57:13.475243 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b952609a27351fa4ebbd7a4f68307aee93a3505e07bb3cb86439e4d963d26cd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b952609a27351fa4ebbd7a4f68307aee93a3505e07bb3cb86439e4d963d26cd7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:56:53Z\\\",\\\"message\\\":\\\"handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 10:56:53.462992 7196 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 10:56:53.463000 7196 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 10:56:53.463008 7196 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 10:56:53.463019 7196 handler.go:208] 
Removed *v1.EgressIP event handler 8\\\\nI0320 10:56:53.463348 7196 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0320 10:56:53.463395 7196 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 10:56:53.463446 7196 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 10:56:53.463458 7196 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0320 10:56:53.463471 7196 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0320 10:56:53.463476 7196 factory.go:656] Stopping watch factory\\\\nI0320 10:56:53.463506 7196 ovnkube.go:599] Stopped ovnkube\\\\nI0320 10:56:53.463520 7196 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 10:56:53.463508 7196 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 10:56:53.463560 7196 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0320 10:56:53.463584 7196 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 10:56:53.463671 7196 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nbkmw_openshift-ovn-kubernetes(eb85f6f9-1c0f-4388-9464-25dfe48d8d0f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f
8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nbkmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:13Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:13 crc kubenswrapper[4860]: I0320 10:57:13.496371 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c14e94ed75bdc67469f86002f56ed08fc3810f5a6316401c00a66d56d405e14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5
a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:13Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:13 crc kubenswrapper[4860]: I0320 10:57:13.512707 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:13Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:13 crc kubenswrapper[4860]: I0320 10:57:13.529071 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:13Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:13 crc kubenswrapper[4860]: I0320 10:57:13.544592 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a19086-4679-4d42-96b8-942e00d8491f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://029dd44d42d236df04007ba436ae1800ebdf356c32c3506f621c55418b50526e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a83599b2c542987a93965e3fd026abc1eccd
07fb78dc6ad777b03821eb4ed59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ngb2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:13Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:14 crc kubenswrapper[4860]: I0320 10:57:14.412941 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:57:14 crc kubenswrapper[4860]: I0320 10:57:14.412991 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:14 crc kubenswrapper[4860]: I0320 10:57:14.413029 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:14 crc kubenswrapper[4860]: E0320 10:57:14.413165 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:57:14 crc kubenswrapper[4860]: E0320 10:57:14.413454 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:14 crc kubenswrapper[4860]: E0320 10:57:14.413487 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:15 crc kubenswrapper[4860]: I0320 10:57:15.413419 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:15 crc kubenswrapper[4860]: E0320 10:57:15.413644 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:15 crc kubenswrapper[4860]: I0320 10:57:15.425432 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 20 10:57:16 crc kubenswrapper[4860]: I0320 10:57:16.412743 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:16 crc kubenswrapper[4860]: I0320 10:57:16.412822 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:16 crc kubenswrapper[4860]: E0320 10:57:16.413540 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:16 crc kubenswrapper[4860]: I0320 10:57:16.412916 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:57:16 crc kubenswrapper[4860]: E0320 10:57:16.413853 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:16 crc kubenswrapper[4860]: E0320 10:57:16.414077 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.413215 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:17 crc kubenswrapper[4860]: E0320 10:57:17.413543 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.437265 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d897530152cf8d1ddca69100f0ae29a4da57552de29a47aaed46aa70fa805e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:57:11Z\\\",\\\"message\\\":\\\"2026-03-20T10:56:25+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_e1884f66-2279-4212-9c83-f9ac441fb962\\\\n2026-03-20T10:56:25+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_e1884f66-2279-4212-9c83-f9ac441fb962 to /host/opt/cni/bin/\\\\n2026-03-20T10:56:26Z [verbose] multus-daemon started\\\\n2026-03-20T10:56:26Z [verbose] Readiness Indicator file check\\\\n2026-03-20T10:57:11Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:57:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:17Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.464893 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c4df2ec9fe14b2447ca7aa8d5d033c15f8ac4a7ce1cd4cb8430596293d9b9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e2d
106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:31Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:17Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.482734 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d35cb5-8c91-463a-a966-d34faa7a97c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9696d34484c79734fbd6e1a40f5e4ce6a680b0a67c52cb58ff7a1a1feb8390ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229f12dfeed51b7ac52ddcc0137be0fa53ddc12d3969ca2c10cdf6d6ac80932d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://229f12dfeed51b7ac52ddcc0137be0fa53ddc12d3969ca2c10cdf6d6ac80932d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:17Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.523504 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:17Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.541866 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:17Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.560127 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://299876d90de1dc71dcbd805a769b991971944194b57cda3dff90392f35318dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:17Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.578680 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ba3ecd4dbf40f800c11196492112694eb2fe160969cbd5f0b6fa7335552777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2176d4712fa4ee31d4322bb4cf9433058b2da0d6000e5bfb9e7d230fdffb3eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:17Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.591386 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q85gq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"035f0b3d-92ee-4564-8dad-28b231e1c800\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q85gq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:17Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:17 crc 
kubenswrapper[4860]: E0320 10:57:17.597946 4860 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.609770 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e92d2fc-00a4-4ad4-873d-d8d49b45c703\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9917a32ef81256fec12a0a0679be3cf3a2e1f5dab824c4c23c2cf252433a68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\
\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b968284eb846046e80483bafc625da31241ff1781ebd5216342bc2538064d124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acfeabcb583a79ac597fdc7d5ea5a7e28192a337743c9ac0d585d931e1a4406c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74e43d4563df
d8a16d0e44d9e42cbad850bf1fe0f96f4cf7d1699ea3b0beec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e74e43d4563dfd8a16d0e44d9e42cbad850bf1fe0f96f4cf7d1699ea3b0beec9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:17Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.628712 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:17Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.643558 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:17Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.656902 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tggrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2ebec10e00fab3194ba3501ab22f0d9381d9e75d28690a6ee5723da97fd226\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"n
ode-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rv72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tggrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:17Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.677291 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c14e94ed75bdc67469f86002f56ed08fc3810f5a6316401c00a66d56d405e14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5
a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:17Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.691124 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:17Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.704399 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:17Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.719983 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:17Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.740292 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b952609a27351fa4ebbd7a4f68307aee93a3505e07bb3cb86439e4d963d26cd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b952609a27351fa4ebbd7a4f68307aee93a3505e07bb3cb86439e4d963d26cd7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:56:53Z\\\",\\\"message\\\":\\\"handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 10:56:53.462992 7196 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 10:56:53.463000 7196 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 10:56:53.463008 7196 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 10:56:53.463019 7196 handler.go:208] 
Removed *v1.EgressIP event handler 8\\\\nI0320 10:56:53.463348 7196 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0320 10:56:53.463395 7196 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 10:56:53.463446 7196 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 10:56:53.463458 7196 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0320 10:56:53.463471 7196 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0320 10:56:53.463476 7196 factory.go:656] Stopping watch factory\\\\nI0320 10:56:53.463506 7196 ovnkube.go:599] Stopped ovnkube\\\\nI0320 10:56:53.463520 7196 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 10:56:53.463508 7196 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 10:56:53.463560 7196 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0320 10:56:53.463584 7196 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 10:56:53.463671 7196 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nbkmw_openshift-ovn-kubernetes(eb85f6f9-1c0f-4388-9464-25dfe48d8d0f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f
8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nbkmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:17Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.758572 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a19086-4679-4d42-96b8-942e00d8491f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://029dd44d42d236df04007ba436ae1800ebdf356c32c3506f621c55418b50526e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a83599b2c542987a93965e3fd026abc1eccd
07fb78dc6ad777b03821eb4ed59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ngb2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:17Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.889141 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.889215 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.889275 4860 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.889339 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.889359 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:17Z","lastTransitionTime":"2026-03-20T10:57:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:17 crc kubenswrapper[4860]: E0320 10:57:17.910717 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:17Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.915933 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.915971 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.915984 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.916002 4860 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.916014 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:17Z","lastTransitionTime":"2026-03-20T10:57:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:17 crc kubenswrapper[4860]: E0320 10:57:17.938417 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:17Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.942809 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.942837 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.942853 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.942873 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.942885 4860 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:17Z","lastTransitionTime":"2026-03-20T10:57:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:17 crc kubenswrapper[4860]: E0320 10:57:17.957655 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:17Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.963895 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.963935 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.963946 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.963967 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.963980 4860 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:17Z","lastTransitionTime":"2026-03-20T10:57:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:17 crc kubenswrapper[4860]: E0320 10:57:17.986822 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:17Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.992694 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.992750 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.992763 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.992788 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.992802 4860 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:17Z","lastTransitionTime":"2026-03-20T10:57:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:18 crc kubenswrapper[4860]: E0320 10:57:18.007633 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:18Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:18 crc kubenswrapper[4860]: E0320 10:57:18.007802 4860 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 10:57:18 crc kubenswrapper[4860]: I0320 10:57:18.412802 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:18 crc kubenswrapper[4860]: I0320 10:57:18.413568 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:57:18 crc kubenswrapper[4860]: E0320 10:57:18.413637 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:18 crc kubenswrapper[4860]: I0320 10:57:18.413675 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:18 crc kubenswrapper[4860]: E0320 10:57:18.413785 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:57:18 crc kubenswrapper[4860]: E0320 10:57:18.413951 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:19 crc kubenswrapper[4860]: I0320 10:57:19.413735 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:19 crc kubenswrapper[4860]: E0320 10:57:19.413883 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:20 crc kubenswrapper[4860]: I0320 10:57:20.412787 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:20 crc kubenswrapper[4860]: E0320 10:57:20.413107 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:20 crc kubenswrapper[4860]: I0320 10:57:20.413166 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:20 crc kubenswrapper[4860]: I0320 10:57:20.413214 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:57:20 crc kubenswrapper[4860]: E0320 10:57:20.413403 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:20 crc kubenswrapper[4860]: E0320 10:57:20.413610 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:57:21 crc kubenswrapper[4860]: I0320 10:57:21.413371 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:21 crc kubenswrapper[4860]: E0320 10:57:21.413533 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:22 crc kubenswrapper[4860]: I0320 10:57:22.413368 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:57:22 crc kubenswrapper[4860]: I0320 10:57:22.413473 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:22 crc kubenswrapper[4860]: E0320 10:57:22.413535 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:57:22 crc kubenswrapper[4860]: I0320 10:57:22.413565 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:22 crc kubenswrapper[4860]: E0320 10:57:22.413664 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:22 crc kubenswrapper[4860]: E0320 10:57:22.413797 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:22 crc kubenswrapper[4860]: I0320 10:57:22.414953 4860 scope.go:117] "RemoveContainer" containerID="b952609a27351fa4ebbd7a4f68307aee93a3505e07bb3cb86439e4d963d26cd7" Mar 20 10:57:22 crc kubenswrapper[4860]: E0320 10:57:22.599615 4860 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 10:57:23 crc kubenswrapper[4860]: I0320 10:57:23.249029 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nbkmw_eb85f6f9-1c0f-4388-9464-25dfe48d8d0f/ovnkube-controller/2.log" Mar 20 10:57:23 crc kubenswrapper[4860]: I0320 10:57:23.253656 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" event={"ID":"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f","Type":"ContainerStarted","Data":"8b2175e8b0f3d0a1b8ef8642ce83c4b05c7ce1f29d0f3417822f71330a375049"} Mar 20 10:57:23 crc kubenswrapper[4860]: I0320 10:57:23.254214 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:57:23 crc kubenswrapper[4860]: I0320 10:57:23.273948 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a19086-4679-4d42-96b8-942e00d8491f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://029dd44d42d236df04007ba436ae1800ebdf356c32c3506f621c55418b50526e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a83599b2c542987a93965e3fd026abc1eccd
07fb78dc6ad777b03821eb4ed59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ngb2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:23 crc kubenswrapper[4860]: I0320 10:57:23.296765 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:23 crc kubenswrapper[4860]: I0320 10:57:23.311216 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:23 crc kubenswrapper[4860]: I0320 10:57:23.323765 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://299876d90de1dc71dcbd805a769b991971944194b57cda3dff90392f35318dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:23 crc kubenswrapper[4860]: I0320 10:57:23.335438 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ba3ecd4dbf40f800c11196492112694eb2fe160969cbd5f0b6fa7335552777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2176d4712fa4ee31d4322bb4cf9433058b2da0d6000e5bfb9e7d230fdffb3eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:23 crc kubenswrapper[4860]: I0320 10:57:23.348855 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d897530152cf8d1ddca69100f0ae29a4da57552de29a47aaed46aa70fa805e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:57:11Z\\\",\\\"message\\\":\\\"2026-03-20T10:56:25+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_e1884f66-2279-4212-9c83-f9ac441fb962\\\\n2026-03-20T10:56:25+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_e1884f66-2279-4212-9c83-f9ac441fb962 to /host/opt/cni/bin/\\\\n2026-03-20T10:56:26Z [verbose] multus-daemon started\\\\n2026-03-20T10:56:26Z [verbose] Readiness Indicator file check\\\\n2026-03-20T10:57:11Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:57:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountP
ath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:23 crc kubenswrapper[4860]: I0320 10:57:23.362986 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c4df2ec9fe14b2447ca7aa8d5d033c15f8ac4a7ce1cd4cb8430596293d9b9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e2d
106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:31Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:23 crc kubenswrapper[4860]: I0320 10:57:23.372869 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d35cb5-8c91-463a-a966-d34faa7a97c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9696d34484c79734fbd6e1a40f5e4ce6a680b0a67c52cb58ff7a1a1feb8390ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229f12dfeed51b7ac52ddcc0137be0fa53ddc12d3969ca2c10cdf6d6ac80932d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://229f12dfeed51b7ac52ddcc0137be0fa53ddc12d3969ca2c10cdf6d6ac80932d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:23 crc kubenswrapper[4860]: I0320 10:57:23.399207 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e92d2fc-00a4-4ad4-873d-d8d49b45c703\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9917a32ef81256fec12a0a0679be3cf3a2e1f5dab824c4c23c2cf252433a68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b968284eb846046e80483bafc625da31241ff1781ebd5216342bc2538064d124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acfeabcb583a79ac597fdc7d5ea5a7e28192a337743c9ac0d585d931e1a4406c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74e43d4563dfd8a16d0e44d9e42cbad850bf1fe0f96f4cf7d1699ea3b0beec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e74e43d4563dfd8a16d0e44d9e42cbad850bf1fe0f96f4cf7d1699ea3b0beec9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:23 crc kubenswrapper[4860]: I0320 10:57:23.412914 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:23 crc kubenswrapper[4860]: I0320 10:57:23.413028 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:23 crc kubenswrapper[4860]: E0320 10:57:23.413204 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:23 crc kubenswrapper[4860]: I0320 10:57:23.427152 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 20 10:57:23 crc kubenswrapper[4860]: I0320 10:57:23.430068 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7
f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:23 crc kubenswrapper[4860]: I0320 10:57:23.440173 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tggrc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2ebec10e00fab3194ba3501ab22f0d9381d9e75d28690a6ee5723da97fd226\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rv72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tggrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:23 crc kubenswrapper[4860]: I0320 10:57:23.454036 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q85gq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"035f0b3d-92ee-4564-8dad-28b231e1c800\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q85gq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:23 crc 
kubenswrapper[4860]: I0320 10:57:23.466428 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:23 crc kubenswrapper[4860]: I0320 10:57:23.480302 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:23 crc kubenswrapper[4860]: I0320 10:57:23.494473 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:23 crc kubenswrapper[4860]: I0320 10:57:23.512119 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b2175e8b0f3d0a1b8ef8642ce83c4b05c7ce1f29d0f3417822f71330a375049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b952609a27351fa4ebbd7a4f68307aee93a3505e07bb3cb86439e4d963d26cd7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:56:53Z\\\",\\\"message\\\":\\\"handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 10:56:53.462992 7196 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 10:56:53.463000 7196 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 10:56:53.463008 7196 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 10:56:53.463019 7196 handler.go:208] 
Removed *v1.EgressIP event handler 8\\\\nI0320 10:56:53.463348 7196 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0320 10:56:53.463395 7196 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 10:56:53.463446 7196 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 10:56:53.463458 7196 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0320 10:56:53.463471 7196 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0320 10:56:53.463476 7196 factory.go:656] Stopping watch factory\\\\nI0320 10:56:53.463506 7196 ovnkube.go:599] Stopped ovnkube\\\\nI0320 10:56:53.463520 7196 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 10:56:53.463508 7196 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 10:56:53.463560 7196 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0320 10:56:53.463584 7196 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 10:56:53.463671 7196 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:57:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\
\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nbkmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:23 crc kubenswrapper[4860]: I0320 10:57:23.526915 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c14e94ed75bdc67469f86002f56ed08fc3810f5a6316401c00a66d56d405e14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5
a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:24 crc kubenswrapper[4860]: I0320 10:57:24.259777 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nbkmw_eb85f6f9-1c0f-4388-9464-25dfe48d8d0f/ovnkube-controller/3.log" Mar 20 10:57:24 crc kubenswrapper[4860]: I0320 10:57:24.260768 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nbkmw_eb85f6f9-1c0f-4388-9464-25dfe48d8d0f/ovnkube-controller/2.log" Mar 20 10:57:24 crc kubenswrapper[4860]: I0320 10:57:24.264330 4860 generic.go:334] "Generic (PLEG): container finished" podID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerID="8b2175e8b0f3d0a1b8ef8642ce83c4b05c7ce1f29d0f3417822f71330a375049" exitCode=1 Mar 20 10:57:24 crc kubenswrapper[4860]: I0320 10:57:24.264407 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" event={"ID":"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f","Type":"ContainerDied","Data":"8b2175e8b0f3d0a1b8ef8642ce83c4b05c7ce1f29d0f3417822f71330a375049"} Mar 20 10:57:24 crc kubenswrapper[4860]: I0320 10:57:24.264466 4860 scope.go:117] "RemoveContainer" 
containerID="b952609a27351fa4ebbd7a4f68307aee93a3505e07bb3cb86439e4d963d26cd7" Mar 20 10:57:24 crc kubenswrapper[4860]: I0320 10:57:24.265159 4860 scope.go:117] "RemoveContainer" containerID="8b2175e8b0f3d0a1b8ef8642ce83c4b05c7ce1f29d0f3417822f71330a375049" Mar 20 10:57:24 crc kubenswrapper[4860]: E0320 10:57:24.265609 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-nbkmw_openshift-ovn-kubernetes(eb85f6f9-1c0f-4388-9464-25dfe48d8d0f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" Mar 20 10:57:24 crc kubenswrapper[4860]: I0320 10:57:24.283915 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d35cb5-8c91-463a-a966-d34faa7a97c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9696d34484c79734fbd6e1a40f5e4ce6a680b0a67c52cb58ff7a1a1feb8390ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42
745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229f12dfeed51b7ac52ddcc0137be0fa53ddc12d3969ca2c10cdf6d6ac80932d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://229f12dfeed51b7ac52ddcc0137be0fa53ddc12d3969ca2c10cdf6d6ac80932d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": 
tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:24 crc kubenswrapper[4860]: I0320 10:57:24.310040 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-d
ir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa37232
69019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a
5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitC
ode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:24 crc kubenswrapper[4860]: I0320 10:57:24.329468 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:24 crc kubenswrapper[4860]: I0320 10:57:24.342890 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://299876d90de1dc71dcbd805a769b991971944194b57cda3dff90392f35318dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:24 crc kubenswrapper[4860]: I0320 10:57:24.357336 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ba3ecd4dbf40f800c11196492112694eb2fe160969cbd5f0b6fa7335552777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2176d4712fa4ee31d4322bb4cf9433058b2da0d6000e5bfb9e7d230fdffb3eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:24 crc kubenswrapper[4860]: I0320 10:57:24.374755 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d897530152cf8d1ddca69100f0ae29a4da57552de29a47aaed46aa70fa805e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:57:11Z\\\",\\\"message\\\":\\\"2026-03-20T10:56:25+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_e1884f66-2279-4212-9c83-f9ac441fb962\\\\n2026-03-20T10:56:25+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_e1884f66-2279-4212-9c83-f9ac441fb962 to /host/opt/cni/bin/\\\\n2026-03-20T10:56:26Z [verbose] multus-daemon started\\\\n2026-03-20T10:56:26Z [verbose] Readiness Indicator file check\\\\n2026-03-20T10:57:11Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:57:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountP
ath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:24 crc kubenswrapper[4860]: I0320 10:57:24.399982 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c4df2ec9fe14b2447ca7aa8d5d033c15f8ac4a7ce1cd4cb8430596293d9b9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e2d
106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:31Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:24 crc kubenswrapper[4860]: I0320 10:57:24.413115 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:24 crc kubenswrapper[4860]: I0320 10:57:24.413138 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:57:24 crc kubenswrapper[4860]: E0320 10:57:24.413299 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:24 crc kubenswrapper[4860]: I0320 10:57:24.413442 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:24 crc kubenswrapper[4860]: E0320 10:57:24.413599 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:24 crc kubenswrapper[4860]: E0320 10:57:24.413949 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:57:24 crc kubenswrapper[4860]: I0320 10:57:24.432418 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8e33372-ff64-42f9-a28d-a1a292559759\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf2624c09c9ed0c88340cb5c33a9f304b84e7f10b768178bc03980768edd770b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://82f4a0bcf048582263f9ba78f91872aba9de0ba2e3cce65a31e587c04a849fbc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:39Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 10:55:10.245671 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0320 10:55:10.247467 1 observer_polling.go:159] Starting file observer\\\\nI0320 10:55:10.249168 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 10:55:10.250405 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 10:55:39.912190 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 10:55:39.912319 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:10Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bf4a38879e8e3c687bd3c57a3c68a29ad9a9e609ea0cbd220493b6ee4e7d9a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/
kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4edc7f588e4fdfa1c92fcf94e685925ef7708d48f6dc4a72363331f66f0b4ab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b411b2d78ae0ca6e465eafe2ca565d78630979ffc93ff9fb0785c70d42e4c447\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\
"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:24 crc kubenswrapper[4860]: I0320 10:57:24.448817 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e92d2fc-00a4-4ad4-873d-d8d49b45c703\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9917a32ef81256fec12a0a0679be3cf3a2e1f5dab824c4c23c2cf252433a68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-sche
duler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b968284eb846046e80483bafc625da31241ff1781ebd5216342bc2538064d124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acfeabcb583a79ac597fdc7d5ea5a7e28192a337743c9ac0d585d931e1a4406c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]
}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74e43d4563dfd8a16d0e44d9e42cbad850bf1fe0f96f4cf7d1699ea3b0beec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e74e43d4563dfd8a16d0e44d9e42cbad850bf1fe0f96f4cf7d1699ea3b0beec9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:24 crc kubenswrapper[4860]: I0320 10:57:24.465554 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:24 crc kubenswrapper[4860]: I0320 10:57:24.477115 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:24 crc kubenswrapper[4860]: I0320 10:57:24.492191 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tggrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2ebec10e00fab3194ba3501ab22f0d9381d9e75d28690a6ee5723da97fd226\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"n
ode-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rv72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tggrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:24 crc kubenswrapper[4860]: I0320 10:57:24.513506 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q85gq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"035f0b3d-92ee-4564-8dad-28b231e1c800\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q85gq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:24 crc 
kubenswrapper[4860]: I0320 10:57:24.527803 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd2
5da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c14e94ed75bdc67469f86002f56ed08fc3810f5a6316401c00a66d56d405e14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 
10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:25 crc kubenswrapper[4860]: I0320 10:57:25.271309 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nbkmw_eb85f6f9-1c0f-4388-9464-25dfe48d8d0f/ovnkube-controller/3.log" Mar 20 10:57:25 crc kubenswrapper[4860]: I0320 10:57:25.277811 4860 scope.go:117] "RemoveContainer" containerID="8b2175e8b0f3d0a1b8ef8642ce83c4b05c7ce1f29d0f3417822f71330a375049" Mar 20 10:57:25 crc kubenswrapper[4860]: E0320 10:57:25.278043 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-nbkmw_openshift-ovn-kubernetes(eb85f6f9-1c0f-4388-9464-25dfe48d8d0f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" Mar 20 10:57:25 crc kubenswrapper[4860]: I0320 10:57:25.299183 4860 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s" podStartSLOduration=96.299142452 podStartE2EDuration="1m36.299142452s" podCreationTimestamp="2026-03-20 10:55:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:24.629626306 +0000 UTC m=+168.850987204" watchObservedRunningTime="2026-03-20 10:57:25.299142452 +0000 UTC m=+169.520503390" Mar 20 10:57:25 crc kubenswrapper[4860]: I0320 10:57:25.322537 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-srbpg" podStartSLOduration=97.322498143 podStartE2EDuration="1m37.322498143s" podCreationTimestamp="2026-03-20 10:55:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:25.321686401 +0000 UTC m=+169.543047379" watchObservedRunningTime="2026-03-20 10:57:25.322498143 +0000 UTC m=+169.543859081" Mar 20 10:57:25 crc kubenswrapper[4860]: I0320 10:57:25.337514 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podStartSLOduration=97.337476875 podStartE2EDuration="1m37.337476875s" podCreationTimestamp="2026-03-20 10:55:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:25.337206197 +0000 UTC m=+169.558567135" watchObservedRunningTime="2026-03-20 10:57:25.337476875 +0000 UTC m=+169.558837783" Mar 20 10:57:25 crc kubenswrapper[4860]: I0320 10:57:25.356294 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-cmc44" podStartSLOduration=97.356268681 podStartE2EDuration="1m37.356268681s" podCreationTimestamp="2026-03-20 10:55:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:25.356075715 +0000 UTC m=+169.577436653" watchObservedRunningTime="2026-03-20 10:57:25.356268681 +0000 UTC m=+169.577629579" Mar 20 10:57:25 crc kubenswrapper[4860]: I0320 10:57:25.391406 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" podStartSLOduration=97.391376225 podStartE2EDuration="1m37.391376225s" podCreationTimestamp="2026-03-20 10:55:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:25.391104177 +0000 UTC m=+169.612465115" watchObservedRunningTime="2026-03-20 10:57:25.391376225 +0000 UTC m=+169.612737143" Mar 20 10:57:25 crc kubenswrapper[4860]: I0320 10:57:25.409666 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=10.409642496 podStartE2EDuration="10.409642496s" podCreationTimestamp="2026-03-20 10:57:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:25.408917997 +0000 UTC m=+169.630278915" watchObservedRunningTime="2026-03-20 10:57:25.409642496 +0000 UTC m=+169.631003404" Mar 20 10:57:25 crc kubenswrapper[4860]: I0320 10:57:25.413384 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:25 crc kubenswrapper[4860]: E0320 10:57:25.413582 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:25 crc kubenswrapper[4860]: I0320 10:57:25.448957 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=74.448929425 podStartE2EDuration="1m14.448929425s" podCreationTimestamp="2026-03-20 10:56:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:25.448490983 +0000 UTC m=+169.669851901" watchObservedRunningTime="2026-03-20 10:57:25.448929425 +0000 UTC m=+169.670290333" Mar 20 10:57:25 crc kubenswrapper[4860]: I0320 10:57:25.498594 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-tggrc" podStartSLOduration=97.498567698 podStartE2EDuration="1m37.498567698s" podCreationTimestamp="2026-03-20 10:55:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:25.486856177 +0000 UTC m=+169.708217065" watchObservedRunningTime="2026-03-20 10:57:25.498567698 +0000 UTC m=+169.719928596" Mar 20 10:57:25 crc kubenswrapper[4860]: I0320 10:57:25.527930 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=41.527887634 podStartE2EDuration="41.527887634s" podCreationTimestamp="2026-03-20 10:56:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:25.526991879 +0000 UTC m=+169.748352787" watchObservedRunningTime="2026-03-20 10:57:25.527887634 +0000 UTC m=+169.749248532" Mar 20 10:57:25 crc kubenswrapper[4860]: I0320 10:57:25.528264 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=2.528255954 podStartE2EDuration="2.528255954s" podCreationTimestamp="2026-03-20 10:57:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:25.512266525 +0000 UTC m=+169.733627423" watchObservedRunningTime="2026-03-20 10:57:25.528255954 +0000 UTC m=+169.749616852" Mar 20 10:57:26 crc kubenswrapper[4860]: I0320 10:57:26.412967 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:26 crc kubenswrapper[4860]: I0320 10:57:26.413119 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:26 crc kubenswrapper[4860]: I0320 10:57:26.413004 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:57:26 crc kubenswrapper[4860]: E0320 10:57:26.413294 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:26 crc kubenswrapper[4860]: E0320 10:57:26.413361 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:26 crc kubenswrapper[4860]: E0320 10:57:26.413509 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:57:27 crc kubenswrapper[4860]: I0320 10:57:27.413021 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:27 crc kubenswrapper[4860]: E0320 10:57:27.414836 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:27 crc kubenswrapper[4860]: E0320 10:57:27.600319 4860 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 10:57:28 crc kubenswrapper[4860]: I0320 10:57:28.036727 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:28 crc kubenswrapper[4860]: I0320 10:57:28.036787 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:28 crc kubenswrapper[4860]: I0320 10:57:28.036805 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:28 crc kubenswrapper[4860]: I0320 10:57:28.036829 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:28 crc kubenswrapper[4860]: I0320 10:57:28.036848 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:28Z","lastTransitionTime":"2026-03-20T10:57:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:57:28 crc kubenswrapper[4860]: I0320 10:57:28.096185 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=81.096144783 podStartE2EDuration="1m21.096144783s" podCreationTimestamp="2026-03-20 10:56:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:25.624336752 +0000 UTC m=+169.845697660" watchObservedRunningTime="2026-03-20 10:57:28.096144783 +0000 UTC m=+172.317505701" Mar 20 10:57:28 crc kubenswrapper[4860]: I0320 10:57:28.098785 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-sqdf8"] Mar 20 10:57:28 crc kubenswrapper[4860]: I0320 10:57:28.099203 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sqdf8" Mar 20 10:57:28 crc kubenswrapper[4860]: I0320 10:57:28.101914 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 20 10:57:28 crc kubenswrapper[4860]: I0320 10:57:28.101996 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 20 10:57:28 crc kubenswrapper[4860]: I0320 10:57:28.101923 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 20 10:57:28 crc kubenswrapper[4860]: I0320 10:57:28.102463 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 20 10:57:28 crc kubenswrapper[4860]: I0320 10:57:28.178524 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/dc11373f-096b-4cc4-810b-f702f819da6c-service-ca\") pod \"cluster-version-operator-5c965bbfc6-sqdf8\" (UID: \"dc11373f-096b-4cc4-810b-f702f819da6c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sqdf8" Mar 20 10:57:28 crc kubenswrapper[4860]: I0320 10:57:28.178637 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dc11373f-096b-4cc4-810b-f702f819da6c-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-sqdf8\" (UID: \"dc11373f-096b-4cc4-810b-f702f819da6c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sqdf8" Mar 20 10:57:28 crc kubenswrapper[4860]: I0320 10:57:28.178750 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/dc11373f-096b-4cc4-810b-f702f819da6c-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-sqdf8\" (UID: \"dc11373f-096b-4cc4-810b-f702f819da6c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sqdf8" Mar 20 10:57:28 crc kubenswrapper[4860]: I0320 10:57:28.178807 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/dc11373f-096b-4cc4-810b-f702f819da6c-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-sqdf8\" (UID: \"dc11373f-096b-4cc4-810b-f702f819da6c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sqdf8" Mar 20 10:57:28 crc kubenswrapper[4860]: I0320 10:57:28.178861 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dc11373f-096b-4cc4-810b-f702f819da6c-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-sqdf8\" (UID: \"dc11373f-096b-4cc4-810b-f702f819da6c\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sqdf8" Mar 20 10:57:28 crc kubenswrapper[4860]: I0320 10:57:28.280546 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dc11373f-096b-4cc4-810b-f702f819da6c-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-sqdf8\" (UID: \"dc11373f-096b-4cc4-810b-f702f819da6c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sqdf8" Mar 20 10:57:28 crc kubenswrapper[4860]: I0320 10:57:28.281103 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dc11373f-096b-4cc4-810b-f702f819da6c-service-ca\") pod \"cluster-version-operator-5c965bbfc6-sqdf8\" (UID: \"dc11373f-096b-4cc4-810b-f702f819da6c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sqdf8" Mar 20 10:57:28 crc kubenswrapper[4860]: I0320 10:57:28.281456 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/dc11373f-096b-4cc4-810b-f702f819da6c-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-sqdf8\" (UID: \"dc11373f-096b-4cc4-810b-f702f819da6c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sqdf8" Mar 20 10:57:28 crc kubenswrapper[4860]: I0320 10:57:28.281726 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dc11373f-096b-4cc4-810b-f702f819da6c-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-sqdf8\" (UID: \"dc11373f-096b-4cc4-810b-f702f819da6c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sqdf8" Mar 20 10:57:28 crc kubenswrapper[4860]: I0320 10:57:28.282004 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/dc11373f-096b-4cc4-810b-f702f819da6c-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-sqdf8\" (UID: \"dc11373f-096b-4cc4-810b-f702f819da6c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sqdf8" Mar 20 10:57:28 crc kubenswrapper[4860]: I0320 10:57:28.281620 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/dc11373f-096b-4cc4-810b-f702f819da6c-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-sqdf8\" (UID: \"dc11373f-096b-4cc4-810b-f702f819da6c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sqdf8" Mar 20 10:57:28 crc kubenswrapper[4860]: I0320 10:57:28.282137 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/dc11373f-096b-4cc4-810b-f702f819da6c-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-sqdf8\" (UID: \"dc11373f-096b-4cc4-810b-f702f819da6c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sqdf8" Mar 20 10:57:28 crc kubenswrapper[4860]: I0320 10:57:28.283182 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dc11373f-096b-4cc4-810b-f702f819da6c-service-ca\") pod \"cluster-version-operator-5c965bbfc6-sqdf8\" (UID: \"dc11373f-096b-4cc4-810b-f702f819da6c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sqdf8" Mar 20 10:57:28 crc kubenswrapper[4860]: I0320 10:57:28.293018 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dc11373f-096b-4cc4-810b-f702f819da6c-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-sqdf8\" (UID: \"dc11373f-096b-4cc4-810b-f702f819da6c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sqdf8" Mar 20 10:57:28 crc kubenswrapper[4860]: I0320 
10:57:28.305795 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dc11373f-096b-4cc4-810b-f702f819da6c-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-sqdf8\" (UID: \"dc11373f-096b-4cc4-810b-f702f819da6c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sqdf8" Mar 20 10:57:28 crc kubenswrapper[4860]: I0320 10:57:28.399784 4860 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 20 10:57:28 crc kubenswrapper[4860]: I0320 10:57:28.411964 4860 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 20 10:57:28 crc kubenswrapper[4860]: I0320 10:57:28.412896 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:57:28 crc kubenswrapper[4860]: I0320 10:57:28.413040 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:28 crc kubenswrapper[4860]: I0320 10:57:28.413086 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:28 crc kubenswrapper[4860]: E0320 10:57:28.413415 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:57:28 crc kubenswrapper[4860]: E0320 10:57:28.414022 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:28 crc kubenswrapper[4860]: E0320 10:57:28.414267 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:28 crc kubenswrapper[4860]: I0320 10:57:28.420123 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sqdf8" Mar 20 10:57:28 crc kubenswrapper[4860]: W0320 10:57:28.437376 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc11373f_096b_4cc4_810b_f702f819da6c.slice/crio-02fa7a35949f9d298d7e11b46e6107bd7cacd4969f37b9b9b0d7922a0bee4e27 WatchSource:0}: Error finding container 02fa7a35949f9d298d7e11b46e6107bd7cacd4969f37b9b9b0d7922a0bee4e27: Status 404 returned error can't find the container with id 02fa7a35949f9d298d7e11b46e6107bd7cacd4969f37b9b9b0d7922a0bee4e27 Mar 20 10:57:29 crc kubenswrapper[4860]: I0320 10:57:29.293736 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sqdf8" event={"ID":"dc11373f-096b-4cc4-810b-f702f819da6c","Type":"ContainerStarted","Data":"d635ee52acf225fb2d2f9567d5edbb00d2b2029902ed3a38b2251701a9c0be82"} Mar 20 10:57:29 crc kubenswrapper[4860]: I0320 10:57:29.293808 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sqdf8" event={"ID":"dc11373f-096b-4cc4-810b-f702f819da6c","Type":"ContainerStarted","Data":"02fa7a35949f9d298d7e11b46e6107bd7cacd4969f37b9b9b0d7922a0bee4e27"} Mar 20 10:57:29 crc kubenswrapper[4860]: I0320 10:57:29.315408 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sqdf8" podStartSLOduration=101.315385745 podStartE2EDuration="1m41.315385745s" podCreationTimestamp="2026-03-20 10:55:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:29.314613054 +0000 UTC m=+173.535973972" watchObservedRunningTime="2026-03-20 10:57:29.315385745 +0000 UTC m=+173.536746653" Mar 20 10:57:29 crc kubenswrapper[4860]: I0320 10:57:29.413311 4860 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:29 crc kubenswrapper[4860]: E0320 10:57:29.413476 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:30 crc kubenswrapper[4860]: I0320 10:57:30.413140 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:30 crc kubenswrapper[4860]: I0320 10:57:30.413307 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:57:30 crc kubenswrapper[4860]: I0320 10:57:30.413381 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:30 crc kubenswrapper[4860]: E0320 10:57:30.413327 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:30 crc kubenswrapper[4860]: E0320 10:57:30.413515 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:57:30 crc kubenswrapper[4860]: E0320 10:57:30.413708 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:31 crc kubenswrapper[4860]: I0320 10:57:31.413335 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:31 crc kubenswrapper[4860]: E0320 10:57:31.414098 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:32 crc kubenswrapper[4860]: I0320 10:57:32.412417 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:32 crc kubenswrapper[4860]: I0320 10:57:32.412417 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:32 crc kubenswrapper[4860]: I0320 10:57:32.412447 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:57:32 crc kubenswrapper[4860]: E0320 10:57:32.413007 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:32 crc kubenswrapper[4860]: E0320 10:57:32.413354 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:32 crc kubenswrapper[4860]: E0320 10:57:32.413773 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:57:32 crc kubenswrapper[4860]: E0320 10:57:32.601604 4860 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 10:57:33 crc kubenswrapper[4860]: I0320 10:57:33.412689 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:33 crc kubenswrapper[4860]: E0320 10:57:33.413196 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:34 crc kubenswrapper[4860]: I0320 10:57:34.414524 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:57:34 crc kubenswrapper[4860]: I0320 10:57:34.414565 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:34 crc kubenswrapper[4860]: I0320 10:57:34.414524 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:34 crc kubenswrapper[4860]: E0320 10:57:34.414715 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:57:34 crc kubenswrapper[4860]: E0320 10:57:34.414806 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:34 crc kubenswrapper[4860]: E0320 10:57:34.414880 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:35 crc kubenswrapper[4860]: I0320 10:57:35.413200 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:35 crc kubenswrapper[4860]: E0320 10:57:35.413460 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:36 crc kubenswrapper[4860]: I0320 10:57:36.412661 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:57:36 crc kubenswrapper[4860]: I0320 10:57:36.412780 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:36 crc kubenswrapper[4860]: E0320 10:57:36.412813 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:57:36 crc kubenswrapper[4860]: I0320 10:57:36.412916 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:36 crc kubenswrapper[4860]: E0320 10:57:36.412969 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:36 crc kubenswrapper[4860]: E0320 10:57:36.413126 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:37 crc kubenswrapper[4860]: I0320 10:57:37.413172 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:37 crc kubenswrapper[4860]: E0320 10:57:37.413977 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:37 crc kubenswrapper[4860]: E0320 10:57:37.602308 4860 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 10:57:38 crc kubenswrapper[4860]: I0320 10:57:38.413297 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:57:38 crc kubenswrapper[4860]: I0320 10:57:38.413401 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 10:57:38 crc kubenswrapper[4860]: I0320 10:57:38.413335 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 10:57:38 crc kubenswrapper[4860]: E0320 10:57:38.413543 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800"
Mar 20 10:57:38 crc kubenswrapper[4860]: E0320 10:57:38.413674 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 10:57:38 crc kubenswrapper[4860]: E0320 10:57:38.413812 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 10:57:39 crc kubenswrapper[4860]: I0320 10:57:39.109723 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/035f0b3d-92ee-4564-8dad-28b231e1c800-metrics-certs\") pod \"network-metrics-daemon-q85gq\" (UID: \"035f0b3d-92ee-4564-8dad-28b231e1c800\") " pod="openshift-multus/network-metrics-daemon-q85gq"
Mar 20 10:57:39 crc kubenswrapper[4860]: E0320 10:57:39.110016 4860 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 20 10:57:39 crc kubenswrapper[4860]: E0320 10:57:39.110155 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/035f0b3d-92ee-4564-8dad-28b231e1c800-metrics-certs podName:035f0b3d-92ee-4564-8dad-28b231e1c800 nodeName:}" failed. No retries permitted until 2026-03-20 10:58:43.110122105 +0000 UTC m=+247.331483013 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/035f0b3d-92ee-4564-8dad-28b231e1c800-metrics-certs") pod "network-metrics-daemon-q85gq" (UID: "035f0b3d-92ee-4564-8dad-28b231e1c800") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 20 10:57:39 crc kubenswrapper[4860]: I0320 10:57:39.413047 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 10:57:39 crc kubenswrapper[4860]: E0320 10:57:39.413367 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 10:57:40 crc kubenswrapper[4860]: I0320 10:57:40.412745 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq"
Mar 20 10:57:40 crc kubenswrapper[4860]: I0320 10:57:40.412823 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 10:57:40 crc kubenswrapper[4860]: I0320 10:57:40.412901 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 10:57:40 crc kubenswrapper[4860]: E0320 10:57:40.412896 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800"
Mar 20 10:57:40 crc kubenswrapper[4860]: E0320 10:57:40.412955 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 10:57:40 crc kubenswrapper[4860]: E0320 10:57:40.413353 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 10:57:40 crc kubenswrapper[4860]: I0320 10:57:40.413727 4860 scope.go:117] "RemoveContainer" containerID="8b2175e8b0f3d0a1b8ef8642ce83c4b05c7ce1f29d0f3417822f71330a375049"
Mar 20 10:57:40 crc kubenswrapper[4860]: E0320 10:57:40.414078 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-nbkmw_openshift-ovn-kubernetes(eb85f6f9-1c0f-4388-9464-25dfe48d8d0f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f"
Mar 20 10:57:41 crc kubenswrapper[4860]: I0320 10:57:41.412980 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 10:57:41 crc kubenswrapper[4860]: E0320 10:57:41.413378 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 10:57:42 crc kubenswrapper[4860]: I0320 10:57:42.412449 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 10:57:42 crc kubenswrapper[4860]: I0320 10:57:42.412449 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 10:57:42 crc kubenswrapper[4860]: I0320 10:57:42.412580 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq"
Mar 20 10:57:42 crc kubenswrapper[4860]: E0320 10:57:42.412705 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 10:57:42 crc kubenswrapper[4860]: E0320 10:57:42.412886 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 10:57:42 crc kubenswrapper[4860]: E0320 10:57:42.412914 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800"
Mar 20 10:57:42 crc kubenswrapper[4860]: E0320 10:57:42.603547 4860 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 20 10:57:43 crc kubenswrapper[4860]: I0320 10:57:43.412469 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 10:57:43 crc kubenswrapper[4860]: E0320 10:57:43.413002 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 10:57:44 crc kubenswrapper[4860]: I0320 10:57:44.412479 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq"
Mar 20 10:57:44 crc kubenswrapper[4860]: I0320 10:57:44.412618 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 10:57:44 crc kubenswrapper[4860]: E0320 10:57:44.412755 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800"
Mar 20 10:57:44 crc kubenswrapper[4860]: E0320 10:57:44.413008 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 10:57:44 crc kubenswrapper[4860]: I0320 10:57:44.413197 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 10:57:44 crc kubenswrapper[4860]: E0320 10:57:44.413362 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 10:57:45 crc kubenswrapper[4860]: I0320 10:57:45.412904 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 10:57:45 crc kubenswrapper[4860]: E0320 10:57:45.413150 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 10:57:46 crc kubenswrapper[4860]: I0320 10:57:46.413189 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq"
Mar 20 10:57:46 crc kubenswrapper[4860]: I0320 10:57:46.413189 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 10:57:46 crc kubenswrapper[4860]: E0320 10:57:46.413460 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800"
Mar 20 10:57:46 crc kubenswrapper[4860]: I0320 10:57:46.413249 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 10:57:46 crc kubenswrapper[4860]: E0320 10:57:46.413624 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 10:57:46 crc kubenswrapper[4860]: E0320 10:57:46.413782 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 10:57:47 crc kubenswrapper[4860]: I0320 10:57:47.412725 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 10:57:47 crc kubenswrapper[4860]: E0320 10:57:47.413925 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 10:57:47 crc kubenswrapper[4860]: E0320 10:57:47.604248 4860 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 20 10:57:48 crc kubenswrapper[4860]: I0320 10:57:48.413362 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 10:57:48 crc kubenswrapper[4860]: I0320 10:57:48.413362 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 10:57:48 crc kubenswrapper[4860]: E0320 10:57:48.413613 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 10:57:48 crc kubenswrapper[4860]: I0320 10:57:48.413395 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq"
Mar 20 10:57:48 crc kubenswrapper[4860]: E0320 10:57:48.413872 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 10:57:48 crc kubenswrapper[4860]: E0320 10:57:48.414092 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800"
Mar 20 10:57:49 crc kubenswrapper[4860]: I0320 10:57:49.412863 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 10:57:49 crc kubenswrapper[4860]: E0320 10:57:49.413447 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 10:57:50 crc kubenswrapper[4860]: I0320 10:57:50.413054 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 10:57:50 crc kubenswrapper[4860]: I0320 10:57:50.413141 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 10:57:50 crc kubenswrapper[4860]: I0320 10:57:50.413177 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq"
Mar 20 10:57:50 crc kubenswrapper[4860]: E0320 10:57:50.413244 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 10:57:50 crc kubenswrapper[4860]: E0320 10:57:50.413400 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 10:57:50 crc kubenswrapper[4860]: E0320 10:57:50.413514 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800"
Mar 20 10:57:51 crc kubenswrapper[4860]: I0320 10:57:51.412692 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 10:57:51 crc kubenswrapper[4860]: E0320 10:57:51.413081 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 10:57:52 crc kubenswrapper[4860]: I0320 10:57:52.413131 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 10:57:52 crc kubenswrapper[4860]: I0320 10:57:52.413175 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq"
Mar 20 10:57:52 crc kubenswrapper[4860]: I0320 10:57:52.413270 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 10:57:52 crc kubenswrapper[4860]: E0320 10:57:52.413428 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 10:57:52 crc kubenswrapper[4860]: E0320 10:57:52.413694 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800"
Mar 20 10:57:52 crc kubenswrapper[4860]: E0320 10:57:52.413876 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 10:57:52 crc kubenswrapper[4860]: E0320 10:57:52.606078 4860 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 20 10:57:53 crc kubenswrapper[4860]: I0320 10:57:53.413512 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 10:57:53 crc kubenswrapper[4860]: E0320 10:57:53.414127 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 10:57:53 crc kubenswrapper[4860]: I0320 10:57:53.414543 4860 scope.go:117] "RemoveContainer" containerID="8b2175e8b0f3d0a1b8ef8642ce83c4b05c7ce1f29d0f3417822f71330a375049"
Mar 20 10:57:53 crc kubenswrapper[4860]: E0320 10:57:53.414763 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-nbkmw_openshift-ovn-kubernetes(eb85f6f9-1c0f-4388-9464-25dfe48d8d0f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f"
Mar 20 10:57:54 crc kubenswrapper[4860]: I0320 10:57:54.412382 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 10:57:54 crc kubenswrapper[4860]: I0320 10:57:54.412489 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq"
Mar 20 10:57:54 crc kubenswrapper[4860]: I0320 10:57:54.412410 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 10:57:54 crc kubenswrapper[4860]: E0320 10:57:54.412680 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 10:57:54 crc kubenswrapper[4860]: E0320 10:57:54.413071 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800"
Mar 20 10:57:54 crc kubenswrapper[4860]: E0320 10:57:54.413210 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 10:57:55 crc kubenswrapper[4860]: I0320 10:57:55.412962 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 10:57:55 crc kubenswrapper[4860]: E0320 10:57:55.413176 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 10:57:56 crc kubenswrapper[4860]: I0320 10:57:56.412622 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 10:57:56 crc kubenswrapper[4860]: I0320 10:57:56.412762 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 10:57:56 crc kubenswrapper[4860]: I0320 10:57:56.412615 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq"
Mar 20 10:57:56 crc kubenswrapper[4860]: E0320 10:57:56.412919 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 10:57:56 crc kubenswrapper[4860]: E0320 10:57:56.413160 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 10:57:56 crc kubenswrapper[4860]: E0320 10:57:56.413389 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800"
Mar 20 10:57:57 crc kubenswrapper[4860]: I0320 10:57:57.413017 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 10:57:57 crc kubenswrapper[4860]: E0320 10:57:57.415273 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 10:57:57 crc kubenswrapper[4860]: E0320 10:57:57.606968 4860 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 20 10:57:58 crc kubenswrapper[4860]: I0320 10:57:58.413024 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 10:57:58 crc kubenswrapper[4860]: I0320 10:57:58.413024 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq"
Mar 20 10:57:58 crc kubenswrapper[4860]: I0320 10:57:58.413119 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 10:57:58 crc kubenswrapper[4860]: E0320 10:57:58.414206 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800"
Mar 20 10:57:58 crc kubenswrapper[4860]: E0320 10:57:58.414343 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 10:57:58 crc kubenswrapper[4860]: E0320 10:57:58.414613 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 10:57:58 crc kubenswrapper[4860]: I0320 10:57:58.415618 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cmc44_a89c8af2-338f-401f-aad5-c6d7763a3b3a/kube-multus/1.log"
Mar 20 10:57:58 crc kubenswrapper[4860]: I0320 10:57:58.416558 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cmc44_a89c8af2-338f-401f-aad5-c6d7763a3b3a/kube-multus/0.log"
Mar 20 10:57:58 crc kubenswrapper[4860]: I0320 10:57:58.416635 4860 generic.go:334] "Generic (PLEG): container finished" podID="a89c8af2-338f-401f-aad5-c6d7763a3b3a" containerID="e1d897530152cf8d1ddca69100f0ae29a4da57552de29a47aaed46aa70fa805e" exitCode=1
Mar 20 10:57:58 crc kubenswrapper[4860]: I0320 10:57:58.416682 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cmc44" event={"ID":"a89c8af2-338f-401f-aad5-c6d7763a3b3a","Type":"ContainerDied","Data":"e1d897530152cf8d1ddca69100f0ae29a4da57552de29a47aaed46aa70fa805e"}
Mar 20 10:57:58 crc kubenswrapper[4860]: I0320 10:57:58.416734 4860 scope.go:117] "RemoveContainer" containerID="b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311"
Mar 20 10:57:58 crc kubenswrapper[4860]: I0320 10:57:58.417435 4860 scope.go:117] "RemoveContainer" containerID="e1d897530152cf8d1ddca69100f0ae29a4da57552de29a47aaed46aa70fa805e"
Mar 20 10:57:58 crc kubenswrapper[4860]: E0320 10:57:58.417803 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-cmc44_openshift-multus(a89c8af2-338f-401f-aad5-c6d7763a3b3a)\"" pod="openshift-multus/multus-cmc44" podUID="a89c8af2-338f-401f-aad5-c6d7763a3b3a"
Mar 20 10:57:59 crc kubenswrapper[4860]: I0320 10:57:59.413178 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 10:57:59 crc kubenswrapper[4860]: E0320 10:57:59.413510 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 10:57:59 crc kubenswrapper[4860]: I0320 10:57:59.422062 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cmc44_a89c8af2-338f-401f-aad5-c6d7763a3b3a/kube-multus/1.log"
Mar 20 10:58:00 crc kubenswrapper[4860]: I0320 10:58:00.412915 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 10:58:00 crc kubenswrapper[4860]: I0320 10:58:00.413030 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq"
Mar 20 10:58:00 crc kubenswrapper[4860]: I0320 10:58:00.412948 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 10:58:00 crc kubenswrapper[4860]: E0320 10:58:00.413114 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 10:58:00 crc kubenswrapper[4860]: E0320 10:58:00.413260 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800"
Mar 20 10:58:00 crc kubenswrapper[4860]: E0320 10:58:00.413586 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 10:58:01 crc kubenswrapper[4860]: I0320 10:58:01.413202 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 10:58:01 crc kubenswrapper[4860]: E0320 10:58:01.413414 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 10:58:02 crc kubenswrapper[4860]: I0320 10:58:02.413453 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq"
Mar 20 10:58:02 crc kubenswrapper[4860]: I0320 10:58:02.413554 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 10:58:02 crc kubenswrapper[4860]: I0320 10:58:02.413613 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 10:58:02 crc kubenswrapper[4860]: E0320 10:58:02.413732 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800"
Mar 20 10:58:02 crc kubenswrapper[4860]: E0320 10:58:02.413796 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 10:58:02 crc kubenswrapper[4860]: E0320 10:58:02.413911 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:58:02 crc kubenswrapper[4860]: E0320 10:58:02.608825 4860 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 10:58:03 crc kubenswrapper[4860]: I0320 10:58:03.412843 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:58:03 crc kubenswrapper[4860]: E0320 10:58:03.413034 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:58:04 crc kubenswrapper[4860]: I0320 10:58:04.412644 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:58:04 crc kubenswrapper[4860]: I0320 10:58:04.412707 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:58:04 crc kubenswrapper[4860]: I0320 10:58:04.412712 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:58:04 crc kubenswrapper[4860]: E0320 10:58:04.412916 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:58:04 crc kubenswrapper[4860]: E0320 10:58:04.413692 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:58:04 crc kubenswrapper[4860]: E0320 10:58:04.413853 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:58:04 crc kubenswrapper[4860]: I0320 10:58:04.414334 4860 scope.go:117] "RemoveContainer" containerID="8b2175e8b0f3d0a1b8ef8642ce83c4b05c7ce1f29d0f3417822f71330a375049" Mar 20 10:58:05 crc kubenswrapper[4860]: I0320 10:58:05.344912 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-q85gq"] Mar 20 10:58:05 crc kubenswrapper[4860]: I0320 10:58:05.345555 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:58:05 crc kubenswrapper[4860]: E0320 10:58:05.346011 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:58:05 crc kubenswrapper[4860]: I0320 10:58:05.413532 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:58:05 crc kubenswrapper[4860]: E0320 10:58:05.414459 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:58:05 crc kubenswrapper[4860]: I0320 10:58:05.445742 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nbkmw_eb85f6f9-1c0f-4388-9464-25dfe48d8d0f/ovnkube-controller/3.log" Mar 20 10:58:05 crc kubenswrapper[4860]: I0320 10:58:05.448132 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" event={"ID":"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f","Type":"ContainerStarted","Data":"4f5a30ad09d458bec4ede239893e4825bfe7b1e45f543e00827124da5aa465f6"} Mar 20 10:58:05 crc kubenswrapper[4860]: I0320 10:58:05.449744 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:58:06 crc kubenswrapper[4860]: I0320 10:58:06.412441 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:58:06 crc kubenswrapper[4860]: I0320 10:58:06.412595 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:58:06 crc kubenswrapper[4860]: E0320 10:58:06.413017 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:58:06 crc kubenswrapper[4860]: E0320 10:58:06.413128 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:58:07 crc kubenswrapper[4860]: I0320 10:58:07.413165 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:58:07 crc kubenswrapper[4860]: I0320 10:58:07.413204 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:58:07 crc kubenswrapper[4860]: E0320 10:58:07.414572 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:58:07 crc kubenswrapper[4860]: E0320 10:58:07.414832 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:58:07 crc kubenswrapper[4860]: E0320 10:58:07.609690 4860 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 10:58:08 crc kubenswrapper[4860]: I0320 10:58:08.413169 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:58:08 crc kubenswrapper[4860]: E0320 10:58:08.413446 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:58:08 crc kubenswrapper[4860]: I0320 10:58:08.413529 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:58:08 crc kubenswrapper[4860]: E0320 10:58:08.413842 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:58:09 crc kubenswrapper[4860]: I0320 10:58:09.413066 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:58:09 crc kubenswrapper[4860]: E0320 10:58:09.413421 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:58:09 crc kubenswrapper[4860]: I0320 10:58:09.413828 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:58:09 crc kubenswrapper[4860]: E0320 10:58:09.413967 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:58:10 crc kubenswrapper[4860]: I0320 10:58:10.413077 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:58:10 crc kubenswrapper[4860]: I0320 10:58:10.413276 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:58:10 crc kubenswrapper[4860]: E0320 10:58:10.413301 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:58:10 crc kubenswrapper[4860]: E0320 10:58:10.413444 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:58:11 crc kubenswrapper[4860]: I0320 10:58:11.413540 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:58:11 crc kubenswrapper[4860]: I0320 10:58:11.413631 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:58:11 crc kubenswrapper[4860]: E0320 10:58:11.413697 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:58:11 crc kubenswrapper[4860]: E0320 10:58:11.413822 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:58:12 crc kubenswrapper[4860]: I0320 10:58:12.321501 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:12 crc kubenswrapper[4860]: E0320 10:58:12.321667 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 11:00:14.321632297 +0000 UTC m=+338.542993215 (durationBeforeRetry 2m2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:12 crc kubenswrapper[4860]: I0320 10:58:12.413062 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:58:12 crc kubenswrapper[4860]: I0320 10:58:12.413144 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:58:12 crc kubenswrapper[4860]: E0320 10:58:12.413338 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:58:12 crc kubenswrapper[4860]: E0320 10:58:12.413511 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:58:12 crc kubenswrapper[4860]: I0320 10:58:12.422912 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:58:12 crc kubenswrapper[4860]: I0320 10:58:12.422979 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:58:12 crc kubenswrapper[4860]: I0320 10:58:12.423028 4860 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:58:12 crc kubenswrapper[4860]: I0320 10:58:12.423102 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:58:12 crc kubenswrapper[4860]: E0320 10:58:12.423181 4860 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 10:58:12 crc kubenswrapper[4860]: E0320 10:58:12.423273 4860 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 10:58:12 crc kubenswrapper[4860]: E0320 10:58:12.423184 4860 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:58:12 crc kubenswrapper[4860]: E0320 10:58:12.423331 4860 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 10:58:12 crc kubenswrapper[4860]: E0320 10:58:12.423330 4860 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not 
registered Mar 20 10:58:12 crc kubenswrapper[4860]: E0320 10:58:12.423367 4860 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 10:58:12 crc kubenswrapper[4860]: E0320 10:58:12.423392 4860 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:58:12 crc kubenswrapper[4860]: E0320 10:58:12.423296 4860 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:58:12 crc kubenswrapper[4860]: E0320 10:58:12.423397 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 11:00:14.423372117 +0000 UTC m=+338.644733005 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:58:12 crc kubenswrapper[4860]: E0320 10:58:12.423481 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-20 11:00:14.423454429 +0000 UTC m=+338.644815367 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 10:58:12 crc kubenswrapper[4860]: E0320 10:58:12.423539 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 11:00:14.423524551 +0000 UTC m=+338.644885489 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:58:12 crc kubenswrapper[4860]: E0320 10:58:12.423564 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 11:00:14.423550562 +0000 UTC m=+338.644911510 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:58:12 crc kubenswrapper[4860]: E0320 10:58:12.611875 4860 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 10:58:13 crc kubenswrapper[4860]: I0320 10:58:13.412568 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:58:13 crc kubenswrapper[4860]: I0320 10:58:13.412604 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:58:13 crc kubenswrapper[4860]: E0320 10:58:13.412698 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:58:13 crc kubenswrapper[4860]: E0320 10:58:13.412818 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:58:13 crc kubenswrapper[4860]: I0320 10:58:13.412960 4860 scope.go:117] "RemoveContainer" containerID="e1d897530152cf8d1ddca69100f0ae29a4da57552de29a47aaed46aa70fa805e" Mar 20 10:58:13 crc kubenswrapper[4860]: I0320 10:58:13.436032 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" podStartSLOduration=145.436009945 podStartE2EDuration="2m25.436009945s" podCreationTimestamp="2026-03-20 10:55:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:05.478536905 +0000 UTC m=+209.699897813" watchObservedRunningTime="2026-03-20 10:58:13.436009945 +0000 UTC m=+217.657370843" Mar 20 10:58:14 crc kubenswrapper[4860]: I0320 10:58:14.412985 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:58:14 crc kubenswrapper[4860]: E0320 10:58:14.413145 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:58:14 crc kubenswrapper[4860]: I0320 10:58:14.413349 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:58:14 crc kubenswrapper[4860]: E0320 10:58:14.413421 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:58:14 crc kubenswrapper[4860]: I0320 10:58:14.487906 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cmc44_a89c8af2-338f-401f-aad5-c6d7763a3b3a/kube-multus/1.log" Mar 20 10:58:14 crc kubenswrapper[4860]: I0320 10:58:14.487977 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cmc44" event={"ID":"a89c8af2-338f-401f-aad5-c6d7763a3b3a","Type":"ContainerStarted","Data":"bbf0bd8dd1e8efce7a65cd6499f4e5d67e95f7c0af27658c16d6dad07affb764"} Mar 20 10:58:15 crc kubenswrapper[4860]: I0320 10:58:15.412929 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:58:15 crc kubenswrapper[4860]: I0320 10:58:15.412952 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:58:15 crc kubenswrapper[4860]: E0320 10:58:15.413084 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:58:15 crc kubenswrapper[4860]: E0320 10:58:15.413323 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:58:16 crc kubenswrapper[4860]: I0320 10:58:16.412622 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:58:16 crc kubenswrapper[4860]: E0320 10:58:16.412802 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:58:16 crc kubenswrapper[4860]: I0320 10:58:16.413017 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:58:16 crc kubenswrapper[4860]: E0320 10:58:16.413290 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:58:17 crc kubenswrapper[4860]: I0320 10:58:17.412633 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:58:17 crc kubenswrapper[4860]: I0320 10:58:17.412765 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:58:17 crc kubenswrapper[4860]: E0320 10:58:17.415194 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:58:17 crc kubenswrapper[4860]: E0320 10:58:17.415496 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:58:18 crc kubenswrapper[4860]: I0320 10:58:18.412429 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:58:18 crc kubenswrapper[4860]: I0320 10:58:18.412580 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:58:18 crc kubenswrapper[4860]: I0320 10:58:18.415126 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 20 10:58:18 crc kubenswrapper[4860]: I0320 10:58:18.415750 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 20 10:58:18 crc kubenswrapper[4860]: I0320 10:58:18.415913 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 20 10:58:18 crc kubenswrapper[4860]: I0320 10:58:18.416901 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.413120 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.413600 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.416491 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.417120 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.585813 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.625203 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-76g8f"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.625791 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-76g8f" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.626518 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-hfxcc"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.627191 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-hfxcc" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.629368 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7xnrh"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.629781 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-7xnrh" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.630572 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5pq7k"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.631090 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5pq7k" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.634871 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.635048 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.640761 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.641939 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.642446 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.642722 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.643041 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.643301 4860 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.643983 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.644134 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.644297 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.644397 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.644494 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.646881 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.647094 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.647246 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.647393 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.647613 4860 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.647760 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.647949 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.648128 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.648784 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.648940 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.649128 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.650825 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-srz5x"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.651420 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.651928 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-xmkqm"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.652687 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xmkqm" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.655268 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-6xl8q"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.656058 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-6xl8q" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.656465 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-hz7zj"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.656996 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hz7zj" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.657570 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-pc5tf"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.657953 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-pc5tf" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.658425 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-twkfs"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.658917 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-twkfs" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.659360 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4429p"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.659936 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4429p" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.661333 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-vpp2k"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.662046 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-vpp2k" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.671298 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-s52jd"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.671961 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-s52jd" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.681442 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.681980 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.682147 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.682518 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.688345 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.688703 4860 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.689428 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.694741 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-nxq82"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.708459 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.709102 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nxq82" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.709947 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.711378 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.712199 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.712486 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.712625 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.712731 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 
10:58:19.712813 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.712893 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.712926 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.713022 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.713080 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.713100 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.713238 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.713411 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.713554 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.712736 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.713654 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 
10:58:19.714043 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.714136 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.714242 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.714352 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.714565 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.714741 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.714888 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.715076 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.715167 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.715255 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.715696 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 20 10:58:19 crc kubenswrapper[4860]: 
I0320 10:58:19.719605 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6ef8eec8-b86d-4f5a-931e-c76e11c07f94-client-ca\") pod \"controller-manager-879f6c89f-7xnrh\" (UID: \"6ef8eec8-b86d-4f5a-931e-c76e11c07f94\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7xnrh" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.719644 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba450d5b-c962-4788-8215-d1eb12f9b314-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-76g8f\" (UID: \"ba450d5b-c962-4788-8215-d1eb12f9b314\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-76g8f" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.719667 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6c5c\" (UniqueName: \"kubernetes.io/projected/a58a54d3-d454-4503-8b70-0e78784efdfc-kube-api-access-v6c5c\") pod \"etcd-operator-b45778765-hfxcc\" (UID: \"a58a54d3-d454-4503-8b70-0e78784efdfc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hfxcc" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.719689 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a58a54d3-d454-4503-8b70-0e78784efdfc-etcd-client\") pod \"etcd-operator-b45778765-hfxcc\" (UID: \"a58a54d3-d454-4503-8b70-0e78784efdfc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hfxcc" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.719709 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a58a54d3-d454-4503-8b70-0e78784efdfc-config\") pod \"etcd-operator-b45778765-hfxcc\" (UID: 
\"a58a54d3-d454-4503-8b70-0e78784efdfc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hfxcc" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.719730 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcm82\" (UniqueName: \"kubernetes.io/projected/ba450d5b-c962-4788-8215-d1eb12f9b314-kube-api-access-vcm82\") pod \"openshift-apiserver-operator-796bbdcf4f-76g8f\" (UID: \"ba450d5b-c962-4788-8215-d1eb12f9b314\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-76g8f" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.719749 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/a58a54d3-d454-4503-8b70-0e78784efdfc-etcd-service-ca\") pod \"etcd-operator-b45778765-hfxcc\" (UID: \"a58a54d3-d454-4503-8b70-0e78784efdfc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hfxcc" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.719780 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6ef8eec8-b86d-4f5a-931e-c76e11c07f94-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-7xnrh\" (UID: \"6ef8eec8-b86d-4f5a-931e-c76e11c07f94\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7xnrh" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.719799 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cfa87f04-40c6-4575-b647-fb13a115b81d-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-5pq7k\" (UID: \"cfa87f04-40c6-4575-b647-fb13a115b81d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5pq7k" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.719816 4860 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfa87f04-40c6-4575-b647-fb13a115b81d-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-5pq7k\" (UID: \"cfa87f04-40c6-4575-b647-fb13a115b81d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5pq7k" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.719835 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a58a54d3-d454-4503-8b70-0e78784efdfc-serving-cert\") pod \"etcd-operator-b45778765-hfxcc\" (UID: \"a58a54d3-d454-4503-8b70-0e78784efdfc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hfxcc" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.719855 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7g2j\" (UniqueName: \"kubernetes.io/projected/6ef8eec8-b86d-4f5a-931e-c76e11c07f94-kube-api-access-x7g2j\") pod \"controller-manager-879f6c89f-7xnrh\" (UID: \"6ef8eec8-b86d-4f5a-931e-c76e11c07f94\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7xnrh" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.719872 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/a58a54d3-d454-4503-8b70-0e78784efdfc-etcd-ca\") pod \"etcd-operator-b45778765-hfxcc\" (UID: \"a58a54d3-d454-4503-8b70-0e78784efdfc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hfxcc" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.719904 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ef8eec8-b86d-4f5a-931e-c76e11c07f94-serving-cert\") pod \"controller-manager-879f6c89f-7xnrh\" 
(UID: \"6ef8eec8-b86d-4f5a-931e-c76e11c07f94\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7xnrh" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.719924 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqjlf\" (UniqueName: \"kubernetes.io/projected/cfa87f04-40c6-4575-b647-fb13a115b81d-kube-api-access-bqjlf\") pod \"openshift-controller-manager-operator-756b6f6bc6-5pq7k\" (UID: \"cfa87f04-40c6-4575-b647-fb13a115b81d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5pq7k" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.719942 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ef8eec8-b86d-4f5a-931e-c76e11c07f94-config\") pod \"controller-manager-879f6c89f-7xnrh\" (UID: \"6ef8eec8-b86d-4f5a-931e-c76e11c07f94\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7xnrh" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.719969 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba450d5b-c962-4788-8215-d1eb12f9b314-config\") pod \"openshift-apiserver-operator-796bbdcf4f-76g8f\" (UID: \"ba450d5b-c962-4788-8215-d1eb12f9b314\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-76g8f" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.720104 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.720138 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.720361 4860 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.720522 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.720591 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.720648 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.720688 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.720767 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.720784 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.720867 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.720879 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.720873 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.720964 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.720979 4860 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.721111 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-lcdbx"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.721991 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lcdbx" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.721116 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.721162 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.721914 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.722551 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.722837 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wv5fd"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.723506 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wv5fd" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.723863 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-5gdgj"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.724604 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-5gdgj" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.725023 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.724994 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.725124 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.725456 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.725466 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.725618 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.725650 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.725813 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.725865 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.725952 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 
20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.725881 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-pr8t8"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.726486 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.726677 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pr8t8" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.727163 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.727870 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.735766 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.736849 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-sqrz5"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.737397 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.737553 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.738022 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-sqrz5" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.747731 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v6ff7"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.759037 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.759058 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.759123 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.759138 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.769962 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.771586 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.772021 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.772024 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.772152 4860 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.772614 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.773475 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-45vfv"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.773734 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.774021 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-rtgqn"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.774357 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v6ff7" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.774452 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-45vfv" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.774873 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rtgqn" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.782490 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jvcqp"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.783129 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jvcqp" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.783674 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-8dbgm"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.784543 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jnqsw"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.785340 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jnqsw" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.784558 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.792444 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qgkd4"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.793464 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qgkd4" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.794600 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-2glmz"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.794809 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.795788 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2glmz" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.800693 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6stjq"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.801429 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-vk2rn"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.801678 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6stjq" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.803391 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vfkd4"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.803483 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-vk2rn" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.808798 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.815803 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mcqdw"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.816211 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vfkd4" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.817486 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-9ktqw"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.817878 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-x4x44"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.818095 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mcqdw" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.818402 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gf5nr"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.818574 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-9ktqw" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.818761 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-x4x44" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.819343 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566725-d6wf9"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.819707 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-d6wf9" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.819883 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566738-5cj22"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.820025 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gf5nr" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.820297 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566738-5cj22" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.820776 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6ef8eec8-b86d-4f5a-931e-c76e11c07f94-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-7xnrh\" (UID: \"6ef8eec8-b86d-4f5a-931e-c76e11c07f94\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7xnrh" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.820875 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cfa87f04-40c6-4575-b647-fb13a115b81d-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-5pq7k\" (UID: \"cfa87f04-40c6-4575-b647-fb13a115b81d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5pq7k" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.820954 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfa87f04-40c6-4575-b647-fb13a115b81d-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-5pq7k\" (UID: \"cfa87f04-40c6-4575-b647-fb13a115b81d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5pq7k" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.821036 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsxn6\" (UniqueName: \"kubernetes.io/projected/24a452b3-94a8-4c29-8409-cb1a8dd11555-kube-api-access-rsxn6\") pod \"machine-config-controller-84d6567774-pr8t8\" (UID: 
\"24a452b3-94a8-4c29-8409-cb1a8dd11555\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pr8t8" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.821128 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rh6md\" (UniqueName: \"kubernetes.io/projected/37972326-b1df-484f-ab10-9c595b145d8c-kube-api-access-rh6md\") pod \"migrator-59844c95c7-lcdbx\" (UID: \"37972326-b1df-484f-ab10-9c595b145d8c\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lcdbx" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.821249 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/d62cba0d-d390-4638-aa42-59631e4bf118-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-4429p\" (UID: \"d62cba0d-d390-4638-aa42-59631e4bf118\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4429p" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.821360 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e8ca532e-b0d7-494c-886f-bff0c8009707-console-serving-cert\") pod \"console-f9d7485db-sqrz5\" (UID: \"e8ca532e-b0d7-494c-886f-bff0c8009707\") " pod="openshift-console/console-f9d7485db-sqrz5" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.821475 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqnwc\" (UniqueName: \"kubernetes.io/projected/3587f3ba-577b-425a-adf5-336a8977dcc5-kube-api-access-rqnwc\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.821617 4860 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/a8f2eaf6-3749-4695-8df1-5972598c8ac6-image-import-ca\") pod \"apiserver-76f77b778f-vpp2k\" (UID: \"a8f2eaf6-3749-4695-8df1-5972598c8ac6\") " pod="openshift-apiserver/apiserver-76f77b778f-vpp2k" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.821727 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8f2eaf6-3749-4695-8df1-5972598c8ac6-serving-cert\") pod \"apiserver-76f77b778f-vpp2k\" (UID: \"a8f2eaf6-3749-4695-8df1-5972598c8ac6\") " pod="openshift-apiserver/apiserver-76f77b778f-vpp2k" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.821846 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a58a54d3-d454-4503-8b70-0e78784efdfc-serving-cert\") pod \"etcd-operator-b45778765-hfxcc\" (UID: \"a58a54d3-d454-4503-8b70-0e78784efdfc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hfxcc" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.821968 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.822089 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3587f3ba-577b-425a-adf5-336a8977dcc5-audit-dir\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.822342 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.822465 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/d582dc3e-7510-42be-aa3a-1d15b35c327c-stats-auth\") pod \"router-default-5444994796-5gdgj\" (UID: \"d582dc3e-7510-42be-aa3a-1d15b35c327c\") " pod="openshift-ingress/router-default-5444994796-5gdgj" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.822571 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/0eff7ea5-251b-44de-b129-c604349d6e6c-encryption-config\") pod \"apiserver-7bbb656c7d-twkfs\" (UID: \"0eff7ea5-251b-44de-b129-c604349d6e6c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-twkfs" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.822792 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9ffz6"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.822873 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a8f2eaf6-3749-4695-8df1-5972598c8ac6-trusted-ca-bundle\") pod \"apiserver-76f77b778f-vpp2k\" (UID: \"a8f2eaf6-3749-4695-8df1-5972598c8ac6\") " pod="openshift-apiserver/apiserver-76f77b778f-vpp2k" Mar 20 10:58:19 crc 
kubenswrapper[4860]: I0320 10:58:19.823124 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkfzm\" (UniqueName: \"kubernetes.io/projected/a8f2eaf6-3749-4695-8df1-5972598c8ac6-kube-api-access-tkfzm\") pod \"apiserver-76f77b778f-vpp2k\" (UID: \"a8f2eaf6-3749-4695-8df1-5972598c8ac6\") " pod="openshift-apiserver/apiserver-76f77b778f-vpp2k" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.823182 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e8ca532e-b0d7-494c-886f-bff0c8009707-console-config\") pod \"console-f9d7485db-sqrz5\" (UID: \"e8ca532e-b0d7-494c-886f-bff0c8009707\") " pod="openshift-console/console-f9d7485db-sqrz5" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.823218 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.823304 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d4ce1856-395a-4003-9642-61da7cbdd789-images\") pod \"machine-api-operator-5694c8668f-s52jd\" (UID: \"d4ce1856-395a-4003-9642-61da7cbdd789\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-s52jd" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.823380 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7g2j\" (UniqueName: \"kubernetes.io/projected/6ef8eec8-b86d-4f5a-931e-c76e11c07f94-kube-api-access-x7g2j\") pod 
\"controller-manager-879f6c89f-7xnrh\" (UID: \"6ef8eec8-b86d-4f5a-931e-c76e11c07f94\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7xnrh" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.823413 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/a58a54d3-d454-4503-8b70-0e78784efdfc-etcd-ca\") pod \"etcd-operator-b45778765-hfxcc\" (UID: \"a58a54d3-d454-4503-8b70-0e78784efdfc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hfxcc" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.823448 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcrkt\" (UniqueName: \"kubernetes.io/projected/d582dc3e-7510-42be-aa3a-1d15b35c327c-kube-api-access-bcrkt\") pod \"router-default-5444994796-5gdgj\" (UID: \"d582dc3e-7510-42be-aa3a-1d15b35c327c\") " pod="openshift-ingress/router-default-5444994796-5gdgj" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.823493 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ef8eec8-b86d-4f5a-931e-c76e11c07f94-serving-cert\") pod \"controller-manager-879f6c89f-7xnrh\" (UID: \"6ef8eec8-b86d-4f5a-931e-c76e11c07f94\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7xnrh" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.823523 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lj2w\" (UniqueName: \"kubernetes.io/projected/f67156a9-f474-4d80-9789-ffbfcc9ec78b-kube-api-access-9lj2w\") pod \"openshift-config-operator-7777fb866f-xmkqm\" (UID: \"f67156a9-f474-4d80-9789-ffbfcc9ec78b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xmkqm" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.823557 4860 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-bqjlf\" (UniqueName: \"kubernetes.io/projected/cfa87f04-40c6-4575-b647-fb13a115b81d-kube-api-access-bqjlf\") pod \"openshift-controller-manager-operator-756b6f6bc6-5pq7k\" (UID: \"cfa87f04-40c6-4575-b647-fb13a115b81d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5pq7k" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.823591 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0eff7ea5-251b-44de-b129-c604349d6e6c-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-twkfs\" (UID: \"0eff7ea5-251b-44de-b129-c604349d6e6c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-twkfs" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.823619 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0eff7ea5-251b-44de-b129-c604349d6e6c-serving-cert\") pod \"apiserver-7bbb656c7d-twkfs\" (UID: \"0eff7ea5-251b-44de-b129-c604349d6e6c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-twkfs" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.823639 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfa87f04-40c6-4575-b647-fb13a115b81d-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-5pq7k\" (UID: \"cfa87f04-40c6-4575-b647-fb13a115b81d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5pq7k" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.823650 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8wdj\" (UniqueName: \"kubernetes.io/projected/7ee7ff56-a3fb-465a-9ab5-8cba6bbfcd0e-kube-api-access-v8wdj\") pod \"console-operator-58897d9998-pc5tf\" (UID: 
\"7ee7ff56-a3fb-465a-9ab5-8cba6bbfcd0e\") " pod="openshift-console-operator/console-operator-58897d9998-pc5tf" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.823558 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6ef8eec8-b86d-4f5a-931e-c76e11c07f94-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-7xnrh\" (UID: \"6ef8eec8-b86d-4f5a-931e-c76e11c07f94\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7xnrh" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.823685 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ef8eec8-b86d-4f5a-931e-c76e11c07f94-config\") pod \"controller-manager-879f6c89f-7xnrh\" (UID: \"6ef8eec8-b86d-4f5a-931e-c76e11c07f94\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7xnrh" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.824095 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d62cba0d-d390-4638-aa42-59631e4bf118-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-4429p\" (UID: \"d62cba0d-d390-4638-aa42-59631e4bf118\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4429p" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.824132 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8ca532e-b0d7-494c-886f-bff0c8009707-trusted-ca-bundle\") pod \"console-f9d7485db-sqrz5\" (UID: \"e8ca532e-b0d7-494c-886f-bff0c8009707\") " pod="openshift-console/console-f9d7485db-sqrz5" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.824265 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: 
\"kubernetes.io/configmap/a58a54d3-d454-4503-8b70-0e78784efdfc-etcd-ca\") pod \"etcd-operator-b45778765-hfxcc\" (UID: \"a58a54d3-d454-4503-8b70-0e78784efdfc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hfxcc" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.824430 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba450d5b-c962-4788-8215-d1eb12f9b314-config\") pod \"openshift-apiserver-operator-796bbdcf4f-76g8f\" (UID: \"ba450d5b-c962-4788-8215-d1eb12f9b314\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-76g8f" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.824589 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.824633 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e8ca532e-b0d7-494c-886f-bff0c8009707-console-oauth-config\") pod \"console-f9d7485db-sqrz5\" (UID: \"e8ca532e-b0d7-494c-886f-bff0c8009707\") " pod="openshift-console/console-f9d7485db-sqrz5" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.824659 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0eff7ea5-251b-44de-b129-c604349d6e6c-etcd-client\") pod \"apiserver-7bbb656c7d-twkfs\" (UID: \"0eff7ea5-251b-44de-b129-c604349d6e6c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-twkfs" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 
10:58:19.824683 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/2f2e3391-68a2-43a8-aba9-17e583066b03-machine-approver-tls\") pod \"machine-approver-56656f9798-hz7zj\" (UID: \"2f2e3391-68a2-43a8-aba9-17e583066b03\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hz7zj" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.824703 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6r94\" (UniqueName: \"kubernetes.io/projected/2f2e3391-68a2-43a8-aba9-17e583066b03-kube-api-access-r6r94\") pod \"machine-approver-56656f9798-hz7zj\" (UID: \"2f2e3391-68a2-43a8-aba9-17e583066b03\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hz7zj" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.824790 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.824815 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f15060fa-5a28-4a12-be7b-2823e921eb90-serving-cert\") pod \"route-controller-manager-6576b87f9c-nxq82\" (UID: \"f15060fa-5a28-4a12-be7b-2823e921eb90\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nxq82" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.824837 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/7ee7ff56-a3fb-465a-9ab5-8cba6bbfcd0e-trusted-ca\") pod \"console-operator-58897d9998-pc5tf\" (UID: \"7ee7ff56-a3fb-465a-9ab5-8cba6bbfcd0e\") " pod="openshift-console-operator/console-operator-58897d9998-pc5tf" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.824883 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqqmp\" (UniqueName: \"kubernetes.io/projected/d4ce1856-395a-4003-9642-61da7cbdd789-kube-api-access-zqqmp\") pod \"machine-api-operator-5694c8668f-s52jd\" (UID: \"d4ce1856-395a-4003-9642-61da7cbdd789\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-s52jd" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.825018 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8f2eaf6-3749-4695-8df1-5972598c8ac6-config\") pod \"apiserver-76f77b778f-vpp2k\" (UID: \"a8f2eaf6-3749-4695-8df1-5972598c8ac6\") " pod="openshift-apiserver/apiserver-76f77b778f-vpp2k" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.825050 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a8f2eaf6-3749-4695-8df1-5972598c8ac6-encryption-config\") pod \"apiserver-76f77b778f-vpp2k\" (UID: \"a8f2eaf6-3749-4695-8df1-5972598c8ac6\") " pod="openshift-apiserver/apiserver-76f77b778f-vpp2k" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.825162 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/d4ce1856-395a-4003-9642-61da7cbdd789-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-s52jd\" (UID: \"d4ce1856-395a-4003-9642-61da7cbdd789\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-s52jd" Mar 20 10:58:19 crc kubenswrapper[4860]: 
I0320 10:58:19.825289 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/24a452b3-94a8-4c29-8409-cb1a8dd11555-proxy-tls\") pod \"machine-config-controller-84d6567774-pr8t8\" (UID: \"24a452b3-94a8-4c29-8409-cb1a8dd11555\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pr8t8" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.825319 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/a8f2eaf6-3749-4695-8df1-5972598c8ac6-audit\") pod \"apiserver-76f77b778f-vpp2k\" (UID: \"a8f2eaf6-3749-4695-8df1-5972598c8ac6\") " pod="openshift-apiserver/apiserver-76f77b778f-vpp2k" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.825351 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/d582dc3e-7510-42be-aa3a-1d15b35c327c-default-certificate\") pod \"router-default-5444994796-5gdgj\" (UID: \"d582dc3e-7510-42be-aa3a-1d15b35c327c\") " pod="openshift-ingress/router-default-5444994796-5gdgj" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.825392 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ee7ff56-a3fb-465a-9ab5-8cba6bbfcd0e-config\") pod \"console-operator-58897d9998-pc5tf\" (UID: \"7ee7ff56-a3fb-465a-9ab5-8cba6bbfcd0e\") " pod="openshift-console-operator/console-operator-58897d9998-pc5tf" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.825437 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6ef8eec8-b86d-4f5a-931e-c76e11c07f94-client-ca\") pod \"controller-manager-879f6c89f-7xnrh\" (UID: \"6ef8eec8-b86d-4f5a-931e-c76e11c07f94\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-7xnrh" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.825486 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba450d5b-c962-4788-8215-d1eb12f9b314-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-76g8f\" (UID: \"ba450d5b-c962-4788-8215-d1eb12f9b314\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-76g8f" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.825522 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f15060fa-5a28-4a12-be7b-2823e921eb90-config\") pod \"route-controller-manager-6576b87f9c-nxq82\" (UID: \"f15060fa-5a28-4a12-be7b-2823e921eb90\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nxq82" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.825604 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d88n2\" (UniqueName: \"kubernetes.io/projected/f15060fa-5a28-4a12-be7b-2823e921eb90-kube-api-access-d88n2\") pod \"route-controller-manager-6576b87f9c-nxq82\" (UID: \"f15060fa-5a28-4a12-be7b-2823e921eb90\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nxq82" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.825632 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0eff7ea5-251b-44de-b129-c604349d6e6c-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-twkfs\" (UID: \"0eff7ea5-251b-44de-b129-c604349d6e6c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-twkfs" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.825965 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-76g8f"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.826057 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-hfxcc"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.826208 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9ffz6" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.826636 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6c5c\" (UniqueName: \"kubernetes.io/projected/a58a54d3-d454-4503-8b70-0e78784efdfc-kube-api-access-v6c5c\") pod \"etcd-operator-b45778765-hfxcc\" (UID: \"a58a54d3-d454-4503-8b70-0e78784efdfc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hfxcc" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.826691 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.826732 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.826766 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lk8bz\" 
(UniqueName: \"kubernetes.io/projected/e8ca532e-b0d7-494c-886f-bff0c8009707-kube-api-access-lk8bz\") pod \"console-f9d7485db-sqrz5\" (UID: \"e8ca532e-b0d7-494c-886f-bff0c8009707\") " pod="openshift-console/console-f9d7485db-sqrz5" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.826796 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9krs\" (UniqueName: \"kubernetes.io/projected/0eff7ea5-251b-44de-b129-c604349d6e6c-kube-api-access-c9krs\") pod \"apiserver-7bbb656c7d-twkfs\" (UID: \"0eff7ea5-251b-44de-b129-c604349d6e6c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-twkfs" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.826844 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/f67156a9-f474-4d80-9789-ffbfcc9ec78b-available-featuregates\") pod \"openshift-config-operator-7777fb866f-xmkqm\" (UID: \"f67156a9-f474-4d80-9789-ffbfcc9ec78b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xmkqm" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.826876 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.826908 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.826935 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e8ca532e-b0d7-494c-886f-bff0c8009707-oauth-serving-cert\") pod \"console-f9d7485db-sqrz5\" (UID: \"e8ca532e-b0d7-494c-886f-bff0c8009707\") " pod="openshift-console/console-f9d7485db-sqrz5" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.826965 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ee7ff56-a3fb-465a-9ab5-8cba6bbfcd0e-serving-cert\") pod \"console-operator-58897d9998-pc5tf\" (UID: \"7ee7ff56-a3fb-465a-9ab5-8cba6bbfcd0e\") " pod="openshift-console-operator/console-operator-58897d9998-pc5tf" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.827142 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a58a54d3-d454-4503-8b70-0e78784efdfc-etcd-client\") pod \"etcd-operator-b45778765-hfxcc\" (UID: \"a58a54d3-d454-4503-8b70-0e78784efdfc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hfxcc" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.827194 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a8f2eaf6-3749-4695-8df1-5972598c8ac6-etcd-serving-ca\") pod \"apiserver-76f77b778f-vpp2k\" (UID: \"a8f2eaf6-3749-4695-8df1-5972598c8ac6\") " pod="openshift-apiserver/apiserver-76f77b778f-vpp2k" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.827245 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a8f2eaf6-3749-4695-8df1-5972598c8ac6-audit-dir\") pod 
\"apiserver-76f77b778f-vpp2k\" (UID: \"a8f2eaf6-3749-4695-8df1-5972598c8ac6\") " pod="openshift-apiserver/apiserver-76f77b778f-vpp2k" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.827274 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d582dc3e-7510-42be-aa3a-1d15b35c327c-service-ca-bundle\") pod \"router-default-5444994796-5gdgj\" (UID: \"d582dc3e-7510-42be-aa3a-1d15b35c327c\") " pod="openshift-ingress/router-default-5444994796-5gdgj" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.827502 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zhgh4"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.827757 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a58a54d3-d454-4503-8b70-0e78784efdfc-config\") pod \"etcd-operator-b45778765-hfxcc\" (UID: \"a58a54d3-d454-4503-8b70-0e78784efdfc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hfxcc" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.827819 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/24a452b3-94a8-4c29-8409-cb1a8dd11555-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-pr8t8\" (UID: \"24a452b3-94a8-4c29-8409-cb1a8dd11555\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pr8t8" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.827873 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a8f2eaf6-3749-4695-8df1-5972598c8ac6-etcd-client\") pod \"apiserver-76f77b778f-vpp2k\" (UID: \"a8f2eaf6-3749-4695-8df1-5972598c8ac6\") " 
pod="openshift-apiserver/apiserver-76f77b778f-vpp2k" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.827897 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f1a219c6-74dc-4511-867e-cf2fce301cad-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wv5fd\" (UID: \"f1a219c6-74dc-4511-867e-cf2fce301cad\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wv5fd" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.828045 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6ef8eec8-b86d-4f5a-931e-c76e11c07f94-client-ca\") pod \"controller-manager-879f6c89f-7xnrh\" (UID: \"6ef8eec8-b86d-4f5a-931e-c76e11c07f94\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7xnrh" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.828140 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcm82\" (UniqueName: \"kubernetes.io/projected/ba450d5b-c962-4788-8215-d1eb12f9b314-kube-api-access-vcm82\") pod \"openshift-apiserver-operator-796bbdcf4f-76g8f\" (UID: \"ba450d5b-c962-4788-8215-d1eb12f9b314\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-76g8f" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.828167 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4ce1856-395a-4003-9642-61da7cbdd789-config\") pod \"machine-api-operator-5694c8668f-s52jd\" (UID: \"d4ce1856-395a-4003-9642-61da7cbdd789\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-s52jd" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.828214 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/3587f3ba-577b-425a-adf5-336a8977dcc5-audit-policies\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.828257 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d62cba0d-d390-4638-aa42-59631e4bf118-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-4429p\" (UID: \"d62cba0d-d390-4638-aa42-59631e4bf118\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4429p" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.828443 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/a58a54d3-d454-4503-8b70-0e78784efdfc-etcd-service-ca\") pod \"etcd-operator-b45778765-hfxcc\" (UID: \"a58a54d3-d454-4503-8b70-0e78784efdfc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hfxcc" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.828552 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a58a54d3-d454-4503-8b70-0e78784efdfc-config\") pod \"etcd-operator-b45778765-hfxcc\" (UID: \"a58a54d3-d454-4503-8b70-0e78784efdfc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hfxcc" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.828956 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a8f2eaf6-3749-4695-8df1-5972598c8ac6-node-pullsecrets\") pod \"apiserver-76f77b778f-vpp2k\" (UID: \"a8f2eaf6-3749-4695-8df1-5972598c8ac6\") " pod="openshift-apiserver/apiserver-76f77b778f-vpp2k" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.829028 4860 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwq9m\" (UniqueName: \"kubernetes.io/projected/c9dab77c-3c60-4c91-8c0a-31791124462d-kube-api-access-rwq9m\") pod \"dns-operator-744455d44c-6xl8q\" (UID: \"c9dab77c-3c60-4c91-8c0a-31791124462d\") " pod="openshift-dns-operator/dns-operator-744455d44c-6xl8q" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.829049 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f2e3391-68a2-43a8-aba9-17e583066b03-config\") pod \"machine-approver-56656f9798-hz7zj\" (UID: \"2f2e3391-68a2-43a8-aba9-17e583066b03\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hz7zj" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.829071 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f1a219c6-74dc-4511-867e-cf2fce301cad-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wv5fd\" (UID: \"f1a219c6-74dc-4511-867e-cf2fce301cad\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wv5fd" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.829088 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0eff7ea5-251b-44de-b129-c604349d6e6c-audit-dir\") pod \"apiserver-7bbb656c7d-twkfs\" (UID: \"0eff7ea5-251b-44de-b129-c604349d6e6c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-twkfs" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.829104 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2f2e3391-68a2-43a8-aba9-17e583066b03-auth-proxy-config\") pod \"machine-approver-56656f9798-hz7zj\" (UID: 
\"2f2e3391-68a2-43a8-aba9-17e583066b03\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hz7zj" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.829444 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/a58a54d3-d454-4503-8b70-0e78784efdfc-etcd-service-ca\") pod \"etcd-operator-b45778765-hfxcc\" (UID: \"a58a54d3-d454-4503-8b70-0e78784efdfc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hfxcc" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.829583 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba450d5b-c962-4788-8215-d1eb12f9b314-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-76g8f\" (UID: \"ba450d5b-c962-4788-8215-d1eb12f9b314\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-76g8f" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.829128 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.829659 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f15060fa-5a28-4a12-be7b-2823e921eb90-client-ca\") pod \"route-controller-manager-6576b87f9c-nxq82\" (UID: \"f15060fa-5a28-4a12-be7b-2823e921eb90\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nxq82" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.829678 4860 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d582dc3e-7510-42be-aa3a-1d15b35c327c-metrics-certs\") pod \"router-default-5444994796-5gdgj\" (UID: \"d582dc3e-7510-42be-aa3a-1d15b35c327c\") " pod="openshift-ingress/router-default-5444994796-5gdgj" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.829697 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e8ca532e-b0d7-494c-886f-bff0c8009707-service-ca\") pod \"console-f9d7485db-sqrz5\" (UID: \"e8ca532e-b0d7-494c-886f-bff0c8009707\") " pod="openshift-console/console-f9d7485db-sqrz5" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.829721 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.829740 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d989g\" (UniqueName: \"kubernetes.io/projected/d62cba0d-d390-4638-aa42-59631e4bf118-kube-api-access-d989g\") pod \"cluster-image-registry-operator-dc59b4c8b-4429p\" (UID: \"d62cba0d-d390-4638-aa42-59631e4bf118\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4429p" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.829925 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ef8eec8-b86d-4f5a-931e-c76e11c07f94-serving-cert\") pod \"controller-manager-879f6c89f-7xnrh\" (UID: \"6ef8eec8-b86d-4f5a-931e-c76e11c07f94\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-7xnrh" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.829934 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.829948 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c9dab77c-3c60-4c91-8c0a-31791124462d-metrics-tls\") pod \"dns-operator-744455d44c-6xl8q\" (UID: \"c9dab77c-3c60-4c91-8c0a-31791124462d\") " pod="openshift-dns-operator/dns-operator-744455d44c-6xl8q" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.829981 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1a219c6-74dc-4511-867e-cf2fce301cad-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wv5fd\" (UID: \"f1a219c6-74dc-4511-867e-cf2fce301cad\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wv5fd" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.830032 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0eff7ea5-251b-44de-b129-c604349d6e6c-audit-policies\") pod \"apiserver-7bbb656c7d-twkfs\" (UID: \"0eff7ea5-251b-44de-b129-c604349d6e6c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-twkfs" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.830051 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f67156a9-f474-4d80-9789-ffbfcc9ec78b-serving-cert\") pod \"openshift-config-operator-7777fb866f-xmkqm\" (UID: \"f67156a9-f474-4d80-9789-ffbfcc9ec78b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xmkqm" Mar 20 10:58:19 
crc kubenswrapper[4860]: I0320 10:58:19.831511 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a58a54d3-d454-4503-8b70-0e78784efdfc-etcd-client\") pod \"etcd-operator-b45778765-hfxcc\" (UID: \"a58a54d3-d454-4503-8b70-0e78784efdfc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hfxcc" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.832190 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ef8eec8-b86d-4f5a-931e-c76e11c07f94-config\") pod \"controller-manager-879f6c89f-7xnrh\" (UID: \"6ef8eec8-b86d-4f5a-931e-c76e11c07f94\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7xnrh" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.835852 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zhgh4" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.836823 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba450d5b-c962-4788-8215-d1eb12f9b314-config\") pod \"openshift-apiserver-operator-796bbdcf4f-76g8f\" (UID: \"ba450d5b-c962-4788-8215-d1eb12f9b314\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-76g8f" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.840930 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a58a54d3-d454-4503-8b70-0e78784efdfc-serving-cert\") pod \"etcd-operator-b45778765-hfxcc\" (UID: \"a58a54d3-d454-4503-8b70-0e78784efdfc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hfxcc" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.842520 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7xnrh"] Mar 20 10:58:19 crc 
kubenswrapper[4860]: I0320 10:58:19.843985 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-6xl8q"]
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.846018 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-twkfs"]
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.847129 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-xmkqm"]
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.848632 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-jffj8"]
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.849635 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-jffj8"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.850141 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-lcdbx"]
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.851140 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-vpp2k"]
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.852275 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-nxq82"]
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.854719 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-srz5x"]
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.855146 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cfa87f04-40c6-4575-b647-fb13a115b81d-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-5pq7k\" (UID: \"cfa87f04-40c6-4575-b647-fb13a115b81d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5pq7k"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.855337 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.859651 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-sqrz5"]
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.859694 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-2k58g"]
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.865976 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-pc5tf"]
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.866139 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-2k58g"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.875384 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-wt65f"]
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.877343 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-wt65f"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.879358 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-rtgqn"]
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.881477 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-45vfv"]
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.884021 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jvcqp"]
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.888690 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.888767 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qgkd4"]
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.889753 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vfkd4"]
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.891203 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5pq7k"]
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.892050 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wv5fd"]
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.893628 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566738-5cj22"]
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.894139 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4429p"]
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.895170 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-l2xf5"]
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.896412 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9ffz6"]
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.896531 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-l2xf5"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.897471 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566725-d6wf9"]
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.898482 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-x4xrf"]
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.899092 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-x4xrf"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.899771 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-pr8t8"]
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.901261 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-8dbgm"]
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.902006 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6stjq"]
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.903119 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-vk2rn"]
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.904339 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jnqsw"]
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.906174 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gf5nr"]
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.906997 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-2glmz"]
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.908188 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-s52jd"]
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.909379 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mcqdw"]
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.910988 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-9ktqw"]
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.911074 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.912301 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v6ff7"]
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.913717 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-2k58g"]
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.914872 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-wt65f"]
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.916320 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-l2xf5"]
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.917073 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-x4x44"]
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.918337 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zhgh4"]
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.919257 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-jffj8"]
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.928344 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.931266 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0eff7ea5-251b-44de-b129-c604349d6e6c-audit-dir\") pod \"apiserver-7bbb656c7d-twkfs\" (UID: \"0eff7ea5-251b-44de-b129-c604349d6e6c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-twkfs"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.931349 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.931418 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e8ca532e-b0d7-494c-886f-bff0c8009707-service-ca\") pod \"console-f9d7485db-sqrz5\" (UID: \"e8ca532e-b0d7-494c-886f-bff0c8009707\") " pod="openshift-console/console-f9d7485db-sqrz5"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.931460 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d989g\" (UniqueName: \"kubernetes.io/projected/d62cba0d-d390-4638-aa42-59631e4bf118-kube-api-access-d989g\") pod \"cluster-image-registry-operator-dc59b4c8b-4429p\" (UID: \"d62cba0d-d390-4638-aa42-59631e4bf118\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4429p"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.931577 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b0b480d-ae68-4b26-b9f8-6b3caef70971-service-ca-bundle\") pod \"authentication-operator-69f744f599-jffj8\" (UID: \"8b0b480d-ae68-4b26-b9f8-6b3caef70971\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jffj8"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.931621 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f67156a9-f474-4d80-9789-ffbfcc9ec78b-serving-cert\") pod \"openshift-config-operator-7777fb866f-xmkqm\" (UID: \"f67156a9-f474-4d80-9789-ffbfcc9ec78b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xmkqm"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.931842 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e8ca532e-b0d7-494c-886f-bff0c8009707-console-serving-cert\") pod \"console-f9d7485db-sqrz5\" (UID: \"e8ca532e-b0d7-494c-886f-bff0c8009707\") " pod="openshift-console/console-f9d7485db-sqrz5"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.932012 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0eff7ea5-251b-44de-b129-c604349d6e6c-audit-dir\") pod \"apiserver-7bbb656c7d-twkfs\" (UID: \"0eff7ea5-251b-44de-b129-c604349d6e6c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-twkfs"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.932941 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/d62cba0d-d390-4638-aa42-59631e4bf118-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-4429p\" (UID: \"d62cba0d-d390-4638-aa42-59631e4bf118\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4429p"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.933055 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/a8f2eaf6-3749-4695-8df1-5972598c8ac6-image-import-ca\") pod \"apiserver-76f77b778f-vpp2k\" (UID: \"a8f2eaf6-3749-4695-8df1-5972598c8ac6\") " pod="openshift-apiserver/apiserver-76f77b778f-vpp2k"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.933119 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8f2eaf6-3749-4695-8df1-5972598c8ac6-serving-cert\") pod \"apiserver-76f77b778f-vpp2k\" (UID: \"a8f2eaf6-3749-4695-8df1-5972598c8ac6\") " pod="openshift-apiserver/apiserver-76f77b778f-vpp2k"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.933180 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqnwc\" (UniqueName: \"kubernetes.io/projected/3587f3ba-577b-425a-adf5-336a8977dcc5-kube-api-access-rqnwc\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.933217 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.933283 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/0eff7ea5-251b-44de-b129-c604349d6e6c-encryption-config\") pod \"apiserver-7bbb656c7d-twkfs\" (UID: \"0eff7ea5-251b-44de-b129-c604349d6e6c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-twkfs"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.933344 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8fe93f79-239c-4b6a-bd22-bbdf55aff0af-srv-cert\") pod \"catalog-operator-68c6474976-gf5nr\" (UID: \"8fe93f79-239c-4b6a-bd22-bbdf55aff0af\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gf5nr"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.933376 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b0b480d-ae68-4b26-b9f8-6b3caef70971-config\") pod \"authentication-operator-69f744f599-jffj8\" (UID: \"8b0b480d-ae68-4b26-b9f8-6b3caef70971\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jffj8"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.933438 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3587f3ba-577b-425a-adf5-336a8977dcc5-audit-dir\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.933469 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.933534 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/d582dc3e-7510-42be-aa3a-1d15b35c327c-stats-auth\") pod \"router-default-5444994796-5gdgj\" (UID: \"d582dc3e-7510-42be-aa3a-1d15b35c327c\") " pod="openshift-ingress/router-default-5444994796-5gdgj"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.933591 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rdtv\" (UniqueName: \"kubernetes.io/projected/8b0b480d-ae68-4b26-b9f8-6b3caef70971-kube-api-access-7rdtv\") pod \"authentication-operator-69f744f599-jffj8\" (UID: \"8b0b480d-ae68-4b26-b9f8-6b3caef70971\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jffj8"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.933618 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a8f2eaf6-3749-4695-8df1-5972598c8ac6-trusted-ca-bundle\") pod \"apiserver-76f77b778f-vpp2k\" (UID: \"a8f2eaf6-3749-4695-8df1-5972598c8ac6\") " pod="openshift-apiserver/apiserver-76f77b778f-vpp2k"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.933658 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e8ca532e-b0d7-494c-886f-bff0c8009707-console-config\") pod \"console-f9d7485db-sqrz5\" (UID: \"e8ca532e-b0d7-494c-886f-bff0c8009707\") " pod="openshift-console/console-f9d7485db-sqrz5"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.933680 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d4ce1856-395a-4003-9642-61da7cbdd789-images\") pod \"machine-api-operator-5694c8668f-s52jd\" (UID: \"d4ce1856-395a-4003-9642-61da7cbdd789\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-s52jd"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.933752 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d09bb09c-7ad0-4971-b6a2-1b37bff617b5-bound-sa-token\") pod \"ingress-operator-5b745b69d9-rtgqn\" (UID: \"d09bb09c-7ad0-4971-b6a2-1b37bff617b5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rtgqn"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.933777 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcrkt\" (UniqueName: \"kubernetes.io/projected/d582dc3e-7510-42be-aa3a-1d15b35c327c-kube-api-access-bcrkt\") pod \"router-default-5444994796-5gdgj\" (UID: \"d582dc3e-7510-42be-aa3a-1d15b35c327c\") " pod="openshift-ingress/router-default-5444994796-5gdgj"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.933862 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b0b480d-ae68-4b26-b9f8-6b3caef70971-serving-cert\") pod \"authentication-operator-69f744f599-jffj8\" (UID: \"8b0b480d-ae68-4b26-b9f8-6b3caef70971\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jffj8"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.933930 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8ca532e-b0d7-494c-886f-bff0c8009707-trusted-ca-bundle\") pod \"console-f9d7485db-sqrz5\" (UID: \"e8ca532e-b0d7-494c-886f-bff0c8009707\") " pod="openshift-console/console-f9d7485db-sqrz5"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.934006 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/2f2e3391-68a2-43a8-aba9-17e583066b03-machine-approver-tls\") pod \"machine-approver-56656f9798-hz7zj\" (UID: \"2f2e3391-68a2-43a8-aba9-17e583066b03\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hz7zj"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.934032 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6r94\" (UniqueName: \"kubernetes.io/projected/2f2e3391-68a2-43a8-aba9-17e583066b03-kube-api-access-r6r94\") pod \"machine-approver-56656f9798-hz7zj\" (UID: \"2f2e3391-68a2-43a8-aba9-17e583066b03\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hz7zj"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.934092 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.934098 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/a8f2eaf6-3749-4695-8df1-5972598c8ac6-image-import-ca\") pod \"apiserver-76f77b778f-vpp2k\" (UID: \"a8f2eaf6-3749-4695-8df1-5972598c8ac6\") " pod="openshift-apiserver/apiserver-76f77b778f-vpp2k"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.934119 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f15060fa-5a28-4a12-be7b-2823e921eb90-serving-cert\") pod \"route-controller-manager-6576b87f9c-nxq82\" (UID: \"f15060fa-5a28-4a12-be7b-2823e921eb90\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nxq82"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.934183 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/d4ce1856-395a-4003-9642-61da7cbdd789-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-s52jd\" (UID: \"d4ce1856-395a-4003-9642-61da7cbdd789\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-s52jd"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.934257 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/a8f2eaf6-3749-4695-8df1-5972598c8ac6-audit\") pod \"apiserver-76f77b778f-vpp2k\" (UID: \"a8f2eaf6-3749-4695-8df1-5972598c8ac6\") " pod="openshift-apiserver/apiserver-76f77b778f-vpp2k"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.934290 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ee7ff56-a3fb-465a-9ab5-8cba6bbfcd0e-config\") pod \"console-operator-58897d9998-pc5tf\" (UID: \"7ee7ff56-a3fb-465a-9ab5-8cba6bbfcd0e\") " pod="openshift-console-operator/console-operator-58897d9998-pc5tf"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.934354 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d88n2\" (UniqueName: \"kubernetes.io/projected/f15060fa-5a28-4a12-be7b-2823e921eb90-kube-api-access-d88n2\") pod \"route-controller-manager-6576b87f9c-nxq82\" (UID: \"f15060fa-5a28-4a12-be7b-2823e921eb90\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nxq82"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.934386 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2wm2\" (UniqueName: \"kubernetes.io/projected/b7c6fefc-e60e-423d-ad15-2e16173ae01b-kube-api-access-j2wm2\") pod \"multus-admission-controller-857f4d67dd-vk2rn\" (UID: \"b7c6fefc-e60e-423d-ad15-2e16173ae01b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vk2rn"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.934417 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcc7l\" (UniqueName: \"kubernetes.io/projected/d09bb09c-7ad0-4971-b6a2-1b37bff617b5-kube-api-access-lcc7l\") pod \"ingress-operator-5b745b69d9-rtgqn\" (UID: \"d09bb09c-7ad0-4971-b6a2-1b37bff617b5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rtgqn"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.934452 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.934480 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9krs\" (UniqueName: \"kubernetes.io/projected/0eff7ea5-251b-44de-b129-c604349d6e6c-kube-api-access-c9krs\") pod \"apiserver-7bbb656c7d-twkfs\" (UID: \"0eff7ea5-251b-44de-b129-c604349d6e6c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-twkfs"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.934512 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d09bb09c-7ad0-4971-b6a2-1b37bff617b5-trusted-ca\") pod \"ingress-operator-5b745b69d9-rtgqn\" (UID: \"d09bb09c-7ad0-4971-b6a2-1b37bff617b5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rtgqn"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.934554 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.934583 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ee7ff56-a3fb-465a-9ab5-8cba6bbfcd0e-serving-cert\") pod \"console-operator-58897d9998-pc5tf\" (UID: \"7ee7ff56-a3fb-465a-9ab5-8cba6bbfcd0e\") " pod="openshift-console-operator/console-operator-58897d9998-pc5tf"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.934609 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67f7l\" (UniqueName: \"kubernetes.io/projected/b2a9fedf-d226-4388-8432-b22efd3b74bb-kube-api-access-67f7l\") pod \"service-ca-9c57cc56f-9ktqw\" (UID: \"b2a9fedf-d226-4388-8432-b22efd3b74bb\") " pod="openshift-service-ca/service-ca-9c57cc56f-9ktqw"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.934653 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a8f2eaf6-3749-4695-8df1-5972598c8ac6-audit-dir\") pod \"apiserver-76f77b778f-vpp2k\" (UID: \"a8f2eaf6-3749-4695-8df1-5972598c8ac6\") " pod="openshift-apiserver/apiserver-76f77b778f-vpp2k"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.934679 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d582dc3e-7510-42be-aa3a-1d15b35c327c-service-ca-bundle\") pod \"router-default-5444994796-5gdgj\" (UID: \"d582dc3e-7510-42be-aa3a-1d15b35c327c\") " pod="openshift-ingress/router-default-5444994796-5gdgj"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.934711 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/24a452b3-94a8-4c29-8409-cb1a8dd11555-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-pr8t8\" (UID: \"24a452b3-94a8-4c29-8409-cb1a8dd11555\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pr8t8"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.934746 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f1a219c6-74dc-4511-867e-cf2fce301cad-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wv5fd\" (UID: \"f1a219c6-74dc-4511-867e-cf2fce301cad\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wv5fd"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.934998 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s682l\" (UniqueName: \"kubernetes.io/projected/1414be44-7a88-4f16-9653-51a5793bd729-kube-api-access-s682l\") pod \"cluster-samples-operator-665b6dd947-qgkd4\" (UID: \"1414be44-7a88-4f16-9653-51a5793bd729\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qgkd4"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.935023 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1414be44-7a88-4f16-9653-51a5793bd729-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-qgkd4\" (UID: \"1414be44-7a88-4f16-9653-51a5793bd729\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qgkd4"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.935092 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/b2a9fedf-d226-4388-8432-b22efd3b74bb-signing-key\") pod \"service-ca-9c57cc56f-9ktqw\" (UID: \"b2a9fedf-d226-4388-8432-b22efd3b74bb\") " pod="openshift-service-ca/service-ca-9c57cc56f-9ktqw"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.935155 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f2e3391-68a2-43a8-aba9-17e583066b03-config\") pod \"machine-approver-56656f9798-hz7zj\" (UID: \"2f2e3391-68a2-43a8-aba9-17e583066b03\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hz7zj"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.935183 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2f2e3391-68a2-43a8-aba9-17e583066b03-auth-proxy-config\") pod \"machine-approver-56656f9798-hz7zj\" (UID: \"2f2e3391-68a2-43a8-aba9-17e583066b03\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hz7zj"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.935252 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f1a219c6-74dc-4511-867e-cf2fce301cad-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wv5fd\" (UID: \"f1a219c6-74dc-4511-867e-cf2fce301cad\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wv5fd"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.935309 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f15060fa-5a28-4a12-be7b-2823e921eb90-client-ca\") pod \"route-controller-manager-6576b87f9c-nxq82\" (UID: \"f15060fa-5a28-4a12-be7b-2823e921eb90\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nxq82"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.935336 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d582dc3e-7510-42be-aa3a-1d15b35c327c-metrics-certs\") pod \"router-default-5444994796-5gdgj\" (UID: \"d582dc3e-7510-42be-aa3a-1d15b35c327c\") " pod="openshift-ingress/router-default-5444994796-5gdgj"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.935392 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.935428 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c9dab77c-3c60-4c91-8c0a-31791124462d-metrics-tls\") pod \"dns-operator-744455d44c-6xl8q\" (UID: \"c9dab77c-3c60-4c91-8c0a-31791124462d\") " pod="openshift-dns-operator/dns-operator-744455d44c-6xl8q"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.935495 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/efda2c60-f018-417a-a73d-2727be57b558-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-6stjq\" (UID: \"efda2c60-f018-417a-a73d-2727be57b558\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6stjq"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.935555 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1a219c6-74dc-4511-867e-cf2fce301cad-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wv5fd\" (UID: \"f1a219c6-74dc-4511-867e-cf2fce301cad\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wv5fd"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.935553 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a8f2eaf6-3749-4695-8df1-5972598c8ac6-audit-dir\") pod \"apiserver-76f77b778f-vpp2k\" (UID: \"a8f2eaf6-3749-4695-8df1-5972598c8ac6\") " pod="openshift-apiserver/apiserver-76f77b778f-vpp2k"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.935592 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0eff7ea5-251b-44de-b129-c604349d6e6c-audit-policies\") pod \"apiserver-7bbb656c7d-twkfs\" (UID: \"0eff7ea5-251b-44de-b129-c604349d6e6c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-twkfs"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.935665 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsxn6\" (UniqueName: \"kubernetes.io/projected/24a452b3-94a8-4c29-8409-cb1a8dd11555-kube-api-access-rsxn6\") pod \"machine-config-controller-84d6567774-pr8t8\" (UID: \"24a452b3-94a8-4c29-8409-cb1a8dd11555\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pr8t8"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.935701 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rh6md\" (UniqueName: \"kubernetes.io/projected/37972326-b1df-484f-ab10-9c595b145d8c-kube-api-access-rh6md\") pod \"migrator-59844c95c7-lcdbx\" (UID: \"37972326-b1df-484f-ab10-9c595b145d8c\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lcdbx"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.935730 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b7c6fefc-e60e-423d-ad15-2e16173ae01b-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-vk2rn\" (UID: \"b7c6fefc-e60e-423d-ad15-2e16173ae01b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vk2rn"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.935757 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9nzq\" (UniqueName: \"kubernetes.io/projected/8fe93f79-239c-4b6a-bd22-bbdf55aff0af-kube-api-access-s9nzq\") pod \"catalog-operator-68c6474976-gf5nr\" (UID: \"8fe93f79-239c-4b6a-bd22-bbdf55aff0af\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gf5nr"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.935789 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/8fe93f79-239c-4b6a-bd22-bbdf55aff0af-profile-collector-cert\") pod \"catalog-operator-68c6474976-gf5nr\" (UID: \"8fe93f79-239c-4b6a-bd22-bbdf55aff0af\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gf5nr"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.935819 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkfzm\" (UniqueName: \"kubernetes.io/projected/a8f2eaf6-3749-4695-8df1-5972598c8ac6-kube-api-access-tkfzm\") pod \"apiserver-76f77b778f-vpp2k\" (UID: \"a8f2eaf6-3749-4695-8df1-5972598c8ac6\") " pod="openshift-apiserver/apiserver-76f77b778f-vpp2k"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.935848 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.935909 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lj2w\" (UniqueName: \"kubernetes.io/projected/f67156a9-f474-4d80-9789-ffbfcc9ec78b-kube-api-access-9lj2w\") pod \"openshift-config-operator-7777fb866f-xmkqm\" (UID: \"f67156a9-f474-4d80-9789-ffbfcc9ec78b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xmkqm"
Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 
10:58:19.935938 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0eff7ea5-251b-44de-b129-c604349d6e6c-serving-cert\") pod \"apiserver-7bbb656c7d-twkfs\" (UID: \"0eff7ea5-251b-44de-b129-c604349d6e6c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-twkfs" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.935967 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8wdj\" (UniqueName: \"kubernetes.io/projected/7ee7ff56-a3fb-465a-9ab5-8cba6bbfcd0e-kube-api-access-v8wdj\") pod \"console-operator-58897d9998-pc5tf\" (UID: \"7ee7ff56-a3fb-465a-9ab5-8cba6bbfcd0e\") " pod="openshift-console-operator/console-operator-58897d9998-pc5tf" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.935975 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.935992 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0eff7ea5-251b-44de-b129-c604349d6e6c-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-twkfs\" (UID: \"0eff7ea5-251b-44de-b129-c604349d6e6c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-twkfs" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.936054 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d62cba0d-d390-4638-aa42-59631e4bf118-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-4429p\" (UID: \"d62cba0d-d390-4638-aa42-59631e4bf118\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4429p" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.936094 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.936157 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e8ca532e-b0d7-494c-886f-bff0c8009707-console-oauth-config\") pod \"console-f9d7485db-sqrz5\" (UID: \"e8ca532e-b0d7-494c-886f-bff0c8009707\") " pod="openshift-console/console-f9d7485db-sqrz5" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.936242 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0eff7ea5-251b-44de-b129-c604349d6e6c-etcd-client\") pod \"apiserver-7bbb656c7d-twkfs\" (UID: \"0eff7ea5-251b-44de-b129-c604349d6e6c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-twkfs" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.936340 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7ee7ff56-a3fb-465a-9ab5-8cba6bbfcd0e-trusted-ca\") pod \"console-operator-58897d9998-pc5tf\" (UID: \"7ee7ff56-a3fb-465a-9ab5-8cba6bbfcd0e\") " pod="openshift-console-operator/console-operator-58897d9998-pc5tf" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.936405 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqqmp\" (UniqueName: \"kubernetes.io/projected/d4ce1856-395a-4003-9642-61da7cbdd789-kube-api-access-zqqmp\") 
pod \"machine-api-operator-5694c8668f-s52jd\" (UID: \"d4ce1856-395a-4003-9642-61da7cbdd789\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-s52jd" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.936467 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8f2eaf6-3749-4695-8df1-5972598c8ac6-config\") pod \"apiserver-76f77b778f-vpp2k\" (UID: \"a8f2eaf6-3749-4695-8df1-5972598c8ac6\") " pod="openshift-apiserver/apiserver-76f77b778f-vpp2k" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.936500 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a8f2eaf6-3749-4695-8df1-5972598c8ac6-encryption-config\") pod \"apiserver-76f77b778f-vpp2k\" (UID: \"a8f2eaf6-3749-4695-8df1-5972598c8ac6\") " pod="openshift-apiserver/apiserver-76f77b778f-vpp2k" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.936575 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/24a452b3-94a8-4c29-8409-cb1a8dd11555-proxy-tls\") pod \"machine-config-controller-84d6567774-pr8t8\" (UID: \"24a452b3-94a8-4c29-8409-cb1a8dd11555\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pr8t8" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.936645 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/d582dc3e-7510-42be-aa3a-1d15b35c327c-default-certificate\") pod \"router-default-5444994796-5gdgj\" (UID: \"d582dc3e-7510-42be-aa3a-1d15b35c327c\") " pod="openshift-ingress/router-default-5444994796-5gdgj" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.936689 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/f15060fa-5a28-4a12-be7b-2823e921eb90-config\") pod \"route-controller-manager-6576b87f9c-nxq82\" (UID: \"f15060fa-5a28-4a12-be7b-2823e921eb90\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nxq82" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.936758 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ckhx\" (UniqueName: \"kubernetes.io/projected/825c6b77-c03a-463c-b9a4-d26a1ac398f0-kube-api-access-2ckhx\") pod \"downloads-7954f5f757-45vfv\" (UID: \"825c6b77-c03a-463c-b9a4-d26a1ac398f0\") " pod="openshift-console/downloads-7954f5f757-45vfv" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.936820 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d09bb09c-7ad0-4971-b6a2-1b37bff617b5-metrics-tls\") pod \"ingress-operator-5b745b69d9-rtgqn\" (UID: \"d09bb09c-7ad0-4971-b6a2-1b37bff617b5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rtgqn" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.936892 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0eff7ea5-251b-44de-b129-c604349d6e6c-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-twkfs\" (UID: \"0eff7ea5-251b-44de-b129-c604349d6e6c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-twkfs" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.936909 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:19 crc 
kubenswrapper[4860]: I0320 10:58:19.936933 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgncr\" (UniqueName: \"kubernetes.io/projected/efda2c60-f018-417a-a73d-2727be57b558-kube-api-access-qgncr\") pod \"kube-storage-version-migrator-operator-b67b599dd-6stjq\" (UID: \"efda2c60-f018-417a-a73d-2727be57b558\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6stjq" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.936998 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.937056 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lk8bz\" (UniqueName: \"kubernetes.io/projected/e8ca532e-b0d7-494c-886f-bff0c8009707-kube-api-access-lk8bz\") pod \"console-f9d7485db-sqrz5\" (UID: \"e8ca532e-b0d7-494c-886f-bff0c8009707\") " pod="openshift-console/console-f9d7485db-sqrz5" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.937071 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d4ce1856-395a-4003-9642-61da7cbdd789-images\") pod \"machine-api-operator-5694c8668f-s52jd\" (UID: \"d4ce1856-395a-4003-9642-61da7cbdd789\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-s52jd" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.937104 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: 
\"kubernetes.io/empty-dir/f67156a9-f474-4d80-9789-ffbfcc9ec78b-available-featuregates\") pod \"openshift-config-operator-7777fb866f-xmkqm\" (UID: \"f67156a9-f474-4d80-9789-ffbfcc9ec78b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xmkqm" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.937171 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e8ca532e-b0d7-494c-886f-bff0c8009707-oauth-serving-cert\") pod \"console-f9d7485db-sqrz5\" (UID: \"e8ca532e-b0d7-494c-886f-bff0c8009707\") " pod="openshift-console/console-f9d7485db-sqrz5" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.937253 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.937326 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a8f2eaf6-3749-4695-8df1-5972598c8ac6-etcd-serving-ca\") pod \"apiserver-76f77b778f-vpp2k\" (UID: \"a8f2eaf6-3749-4695-8df1-5972598c8ac6\") " pod="openshift-apiserver/apiserver-76f77b778f-vpp2k" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.937389 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b0b480d-ae68-4b26-b9f8-6b3caef70971-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-jffj8\" (UID: \"8b0b480d-ae68-4b26-b9f8-6b3caef70971\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jffj8" Mar 20 10:58:19 crc 
kubenswrapper[4860]: I0320 10:58:19.939625 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1a219c6-74dc-4511-867e-cf2fce301cad-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wv5fd\" (UID: \"f1a219c6-74dc-4511-867e-cf2fce301cad\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wv5fd" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.939864 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8f2eaf6-3749-4695-8df1-5972598c8ac6-serving-cert\") pod \"apiserver-76f77b778f-vpp2k\" (UID: \"a8f2eaf6-3749-4695-8df1-5972598c8ac6\") " pod="openshift-apiserver/apiserver-76f77b778f-vpp2k" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.939912 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ee7ff56-a3fb-465a-9ab5-8cba6bbfcd0e-config\") pod \"console-operator-58897d9998-pc5tf\" (UID: \"7ee7ff56-a3fb-465a-9ab5-8cba6bbfcd0e\") " pod="openshift-console-operator/console-operator-58897d9998-pc5tf" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.941209 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0eff7ea5-251b-44de-b129-c604349d6e6c-audit-policies\") pod \"apiserver-7bbb656c7d-twkfs\" (UID: \"0eff7ea5-251b-44de-b129-c604349d6e6c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-twkfs" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.941595 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/d4ce1856-395a-4003-9642-61da7cbdd789-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-s52jd\" (UID: \"d4ce1856-395a-4003-9642-61da7cbdd789\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-s52jd" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.942269 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c9dab77c-3c60-4c91-8c0a-31791124462d-metrics-tls\") pod \"dns-operator-744455d44c-6xl8q\" (UID: \"c9dab77c-3c60-4c91-8c0a-31791124462d\") " pod="openshift-dns-operator/dns-operator-744455d44c-6xl8q" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.942624 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7ee7ff56-a3fb-465a-9ab5-8cba6bbfcd0e-trusted-ca\") pod \"console-operator-58897d9998-pc5tf\" (UID: \"7ee7ff56-a3fb-465a-9ab5-8cba6bbfcd0e\") " pod="openshift-console-operator/console-operator-58897d9998-pc5tf" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.942992 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/24a452b3-94a8-4c29-8409-cb1a8dd11555-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-pr8t8\" (UID: \"24a452b3-94a8-4c29-8409-cb1a8dd11555\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pr8t8" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.943203 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a8f2eaf6-3749-4695-8df1-5972598c8ac6-trusted-ca-bundle\") pod \"apiserver-76f77b778f-vpp2k\" (UID: \"a8f2eaf6-3749-4695-8df1-5972598c8ac6\") " pod="openshift-apiserver/apiserver-76f77b778f-vpp2k" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.943515 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a8f2eaf6-3749-4695-8df1-5972598c8ac6-etcd-serving-ca\") pod \"apiserver-76f77b778f-vpp2k\" (UID: 
\"a8f2eaf6-3749-4695-8df1-5972598c8ac6\") " pod="openshift-apiserver/apiserver-76f77b778f-vpp2k" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.943805 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/f67156a9-f474-4d80-9789-ffbfcc9ec78b-available-featuregates\") pod \"openshift-config-operator-7777fb866f-xmkqm\" (UID: \"f67156a9-f474-4d80-9789-ffbfcc9ec78b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xmkqm" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.944071 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f15060fa-5a28-4a12-be7b-2823e921eb90-serving-cert\") pod \"route-controller-manager-6576b87f9c-nxq82\" (UID: \"f15060fa-5a28-4a12-be7b-2823e921eb90\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nxq82" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.944148 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/a8f2eaf6-3749-4695-8df1-5972598c8ac6-audit\") pod \"apiserver-76f77b778f-vpp2k\" (UID: \"a8f2eaf6-3749-4695-8df1-5972598c8ac6\") " pod="openshift-apiserver/apiserver-76f77b778f-vpp2k" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.944182 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3587f3ba-577b-425a-adf5-336a8977dcc5-audit-dir\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.944206 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f2e3391-68a2-43a8-aba9-17e583066b03-config\") pod \"machine-approver-56656f9798-hz7zj\" 
(UID: \"2f2e3391-68a2-43a8-aba9-17e583066b03\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hz7zj" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.944313 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2f2e3391-68a2-43a8-aba9-17e583066b03-auth-proxy-config\") pod \"machine-approver-56656f9798-hz7zj\" (UID: \"2f2e3391-68a2-43a8-aba9-17e583066b03\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hz7zj" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.944330 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.944719 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.944746 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.944879 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"stats-auth\" (UniqueName: \"kubernetes.io/secret/d582dc3e-7510-42be-aa3a-1d15b35c327c-stats-auth\") pod \"router-default-5444994796-5gdgj\" (UID: \"d582dc3e-7510-42be-aa3a-1d15b35c327c\") " pod="openshift-ingress/router-default-5444994796-5gdgj" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.945163 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.945163 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.945450 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8f2eaf6-3749-4695-8df1-5972598c8ac6-config\") pod \"apiserver-76f77b778f-vpp2k\" (UID: \"a8f2eaf6-3749-4695-8df1-5972598c8ac6\") " pod="openshift-apiserver/apiserver-76f77b778f-vpp2k" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.945548 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a8f2eaf6-3749-4695-8df1-5972598c8ac6-etcd-client\") pod \"apiserver-76f77b778f-vpp2k\" (UID: \"a8f2eaf6-3749-4695-8df1-5972598c8ac6\") " pod="openshift-apiserver/apiserver-76f77b778f-vpp2k" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.945585 4860 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4ce1856-395a-4003-9642-61da7cbdd789-config\") pod \"machine-api-operator-5694c8668f-s52jd\" (UID: \"d4ce1856-395a-4003-9642-61da7cbdd789\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-s52jd" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.945614 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/b2a9fedf-d226-4388-8432-b22efd3b74bb-signing-cabundle\") pod \"service-ca-9c57cc56f-9ktqw\" (UID: \"b2a9fedf-d226-4388-8432-b22efd3b74bb\") " pod="openshift-service-ca/service-ca-9c57cc56f-9ktqw" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.945817 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0eff7ea5-251b-44de-b129-c604349d6e6c-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-twkfs\" (UID: \"0eff7ea5-251b-44de-b129-c604349d6e6c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-twkfs" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.945907 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d62cba0d-d390-4638-aa42-59631e4bf118-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-4429p\" (UID: \"d62cba0d-d390-4638-aa42-59631e4bf118\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4429p" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.945940 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3587f3ba-577b-425a-adf5-336a8977dcc5-audit-policies\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.945972 
4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a8f2eaf6-3749-4695-8df1-5972598c8ac6-node-pullsecrets\") pod \"apiserver-76f77b778f-vpp2k\" (UID: \"a8f2eaf6-3749-4695-8df1-5972598c8ac6\") " pod="openshift-apiserver/apiserver-76f77b778f-vpp2k" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.945996 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d582dc3e-7510-42be-aa3a-1d15b35c327c-service-ca-bundle\") pod \"router-default-5444994796-5gdgj\" (UID: \"d582dc3e-7510-42be-aa3a-1d15b35c327c\") " pod="openshift-ingress/router-default-5444994796-5gdgj" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.946069 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a8f2eaf6-3749-4695-8df1-5972598c8ac6-node-pullsecrets\") pod \"apiserver-76f77b778f-vpp2k\" (UID: \"a8f2eaf6-3749-4695-8df1-5972598c8ac6\") " pod="openshift-apiserver/apiserver-76f77b778f-vpp2k" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.946324 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/efda2c60-f018-417a-a73d-2727be57b558-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-6stjq\" (UID: \"efda2c60-f018-417a-a73d-2727be57b558\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6stjq" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.946697 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f15060fa-5a28-4a12-be7b-2823e921eb90-config\") pod \"route-controller-manager-6576b87f9c-nxq82\" (UID: \"f15060fa-5a28-4a12-be7b-2823e921eb90\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nxq82" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.946832 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0eff7ea5-251b-44de-b129-c604349d6e6c-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-twkfs\" (UID: \"0eff7ea5-251b-44de-b129-c604349d6e6c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-twkfs" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.946902 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3587f3ba-577b-425a-adf5-336a8977dcc5-audit-policies\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.946918 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwq9m\" (UniqueName: \"kubernetes.io/projected/c9dab77c-3c60-4c91-8c0a-31791124462d-kube-api-access-rwq9m\") pod \"dns-operator-744455d44c-6xl8q\" (UID: \"c9dab77c-3c60-4c91-8c0a-31791124462d\") " pod="openshift-dns-operator/dns-operator-744455d44c-6xl8q" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.947104 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f15060fa-5a28-4a12-be7b-2823e921eb90-client-ca\") pod \"route-controller-manager-6576b87f9c-nxq82\" (UID: \"f15060fa-5a28-4a12-be7b-2823e921eb90\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nxq82" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.947376 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d582dc3e-7510-42be-aa3a-1d15b35c327c-metrics-certs\") pod 
\"router-default-5444994796-5gdgj\" (UID: \"d582dc3e-7510-42be-aa3a-1d15b35c327c\") " pod="openshift-ingress/router-default-5444994796-5gdgj" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.947866 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a8f2eaf6-3749-4695-8df1-5972598c8ac6-encryption-config\") pod \"apiserver-76f77b778f-vpp2k\" (UID: \"a8f2eaf6-3749-4695-8df1-5972598c8ac6\") " pod="openshift-apiserver/apiserver-76f77b778f-vpp2k" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.948390 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.948467 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0eff7ea5-251b-44de-b129-c604349d6e6c-serving-cert\") pod \"apiserver-7bbb656c7d-twkfs\" (UID: \"0eff7ea5-251b-44de-b129-c604349d6e6c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-twkfs" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.949558 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.949774 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f1a219c6-74dc-4511-867e-cf2fce301cad-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wv5fd\" (UID: \"f1a219c6-74dc-4511-867e-cf2fce301cad\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wv5fd" Mar 20 10:58:19 crc 
kubenswrapper[4860]: I0320 10:58:19.949848 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ee7ff56-a3fb-465a-9ab5-8cba6bbfcd0e-serving-cert\") pod \"console-operator-58897d9998-pc5tf\" (UID: \"7ee7ff56-a3fb-465a-9ab5-8cba6bbfcd0e\") " pod="openshift-console-operator/console-operator-58897d9998-pc5tf" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.950173 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.950522 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d62cba0d-d390-4638-aa42-59631e4bf118-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-4429p\" (UID: \"d62cba0d-d390-4638-aa42-59631e4bf118\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4429p" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.950644 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.950927 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/24a452b3-94a8-4c29-8409-cb1a8dd11555-proxy-tls\") pod \"machine-config-controller-84d6567774-pr8t8\" (UID: 
\"24a452b3-94a8-4c29-8409-cb1a8dd11555\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pr8t8" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.951100 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4ce1856-395a-4003-9642-61da7cbdd789-config\") pod \"machine-api-operator-5694c8668f-s52jd\" (UID: \"d4ce1856-395a-4003-9642-61da7cbdd789\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-s52jd" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.951311 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/0eff7ea5-251b-44de-b129-c604349d6e6c-encryption-config\") pod \"apiserver-7bbb656c7d-twkfs\" (UID: \"0eff7ea5-251b-44de-b129-c604349d6e6c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-twkfs" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.951421 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/d62cba0d-d390-4638-aa42-59631e4bf118-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-4429p\" (UID: \"d62cba0d-d390-4638-aa42-59631e4bf118\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4429p" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.951948 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a8f2eaf6-3749-4695-8df1-5972598c8ac6-etcd-client\") pod \"apiserver-76f77b778f-vpp2k\" (UID: \"a8f2eaf6-3749-4695-8df1-5972598c8ac6\") " pod="openshift-apiserver/apiserver-76f77b778f-vpp2k" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.952211 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.952298 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/d582dc3e-7510-42be-aa3a-1d15b35c327c-default-certificate\") pod \"router-default-5444994796-5gdgj\" (UID: \"d582dc3e-7510-42be-aa3a-1d15b35c327c\") " pod="openshift-ingress/router-default-5444994796-5gdgj" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.952595 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/2f2e3391-68a2-43a8-aba9-17e583066b03-machine-approver-tls\") pod \"machine-approver-56656f9798-hz7zj\" (UID: \"2f2e3391-68a2-43a8-aba9-17e583066b03\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hz7zj" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.953636 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f67156a9-f474-4d80-9789-ffbfcc9ec78b-serving-cert\") pod \"openshift-config-operator-7777fb866f-xmkqm\" (UID: \"f67156a9-f474-4d80-9789-ffbfcc9ec78b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xmkqm" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.954182 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0eff7ea5-251b-44de-b129-c604349d6e6c-etcd-client\") pod \"apiserver-7bbb656c7d-twkfs\" (UID: \"0eff7ea5-251b-44de-b129-c604349d6e6c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-twkfs" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.956624 4860 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e8ca532e-b0d7-494c-886f-bff0c8009707-console-serving-cert\") pod \"console-f9d7485db-sqrz5\" (UID: \"e8ca532e-b0d7-494c-886f-bff0c8009707\") " pod="openshift-console/console-f9d7485db-sqrz5" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.968041 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.981039 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e8ca532e-b0d7-494c-886f-bff0c8009707-console-oauth-config\") pod \"console-f9d7485db-sqrz5\" (UID: \"e8ca532e-b0d7-494c-886f-bff0c8009707\") " pod="openshift-console/console-f9d7485db-sqrz5" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.989105 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.008077 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.036982 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.047692 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/b2a9fedf-d226-4388-8432-b22efd3b74bb-signing-cabundle\") pod \"service-ca-9c57cc56f-9ktqw\" (UID: \"b2a9fedf-d226-4388-8432-b22efd3b74bb\") " pod="openshift-service-ca/service-ca-9c57cc56f-9ktqw" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.047735 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/efda2c60-f018-417a-a73d-2727be57b558-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-6stjq\" (UID: \"efda2c60-f018-417a-a73d-2727be57b558\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6stjq" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.047784 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b0b480d-ae68-4b26-b9f8-6b3caef70971-service-ca-bundle\") pod \"authentication-operator-69f744f599-jffj8\" (UID: \"8b0b480d-ae68-4b26-b9f8-6b3caef70971\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jffj8" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.048321 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8fe93f79-239c-4b6a-bd22-bbdf55aff0af-srv-cert\") pod \"catalog-operator-68c6474976-gf5nr\" (UID: \"8fe93f79-239c-4b6a-bd22-bbdf55aff0af\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gf5nr" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.048369 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b0b480d-ae68-4b26-b9f8-6b3caef70971-config\") pod \"authentication-operator-69f744f599-jffj8\" (UID: \"8b0b480d-ae68-4b26-b9f8-6b3caef70971\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jffj8" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.048404 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rdtv\" (UniqueName: \"kubernetes.io/projected/8b0b480d-ae68-4b26-b9f8-6b3caef70971-kube-api-access-7rdtv\") pod \"authentication-operator-69f744f599-jffj8\" (UID: \"8b0b480d-ae68-4b26-b9f8-6b3caef70971\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-jffj8" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.048441 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d09bb09c-7ad0-4971-b6a2-1b37bff617b5-bound-sa-token\") pod \"ingress-operator-5b745b69d9-rtgqn\" (UID: \"d09bb09c-7ad0-4971-b6a2-1b37bff617b5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rtgqn" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.048497 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.048511 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b0b480d-ae68-4b26-b9f8-6b3caef70971-serving-cert\") pod \"authentication-operator-69f744f599-jffj8\" (UID: \"8b0b480d-ae68-4b26-b9f8-6b3caef70971\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jffj8" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.049286 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2wm2\" (UniqueName: \"kubernetes.io/projected/b7c6fefc-e60e-423d-ad15-2e16173ae01b-kube-api-access-j2wm2\") pod \"multus-admission-controller-857f4d67dd-vk2rn\" (UID: \"b7c6fefc-e60e-423d-ad15-2e16173ae01b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vk2rn" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.049428 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8ca532e-b0d7-494c-886f-bff0c8009707-trusted-ca-bundle\") pod \"console-f9d7485db-sqrz5\" (UID: \"e8ca532e-b0d7-494c-886f-bff0c8009707\") " pod="openshift-console/console-f9d7485db-sqrz5" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.049326 4860 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcc7l\" (UniqueName: \"kubernetes.io/projected/d09bb09c-7ad0-4971-b6a2-1b37bff617b5-kube-api-access-lcc7l\") pod \"ingress-operator-5b745b69d9-rtgqn\" (UID: \"d09bb09c-7ad0-4971-b6a2-1b37bff617b5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rtgqn" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.049556 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d09bb09c-7ad0-4971-b6a2-1b37bff617b5-trusted-ca\") pod \"ingress-operator-5b745b69d9-rtgqn\" (UID: \"d09bb09c-7ad0-4971-b6a2-1b37bff617b5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rtgqn" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.049611 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67f7l\" (UniqueName: \"kubernetes.io/projected/b2a9fedf-d226-4388-8432-b22efd3b74bb-kube-api-access-67f7l\") pod \"service-ca-9c57cc56f-9ktqw\" (UID: \"b2a9fedf-d226-4388-8432-b22efd3b74bb\") " pod="openshift-service-ca/service-ca-9c57cc56f-9ktqw" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.049669 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s682l\" (UniqueName: \"kubernetes.io/projected/1414be44-7a88-4f16-9653-51a5793bd729-kube-api-access-s682l\") pod \"cluster-samples-operator-665b6dd947-qgkd4\" (UID: \"1414be44-7a88-4f16-9653-51a5793bd729\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qgkd4" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.049705 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1414be44-7a88-4f16-9653-51a5793bd729-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-qgkd4\" (UID: \"1414be44-7a88-4f16-9653-51a5793bd729\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qgkd4" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.049984 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/b2a9fedf-d226-4388-8432-b22efd3b74bb-signing-key\") pod \"service-ca-9c57cc56f-9ktqw\" (UID: \"b2a9fedf-d226-4388-8432-b22efd3b74bb\") " pod="openshift-service-ca/service-ca-9c57cc56f-9ktqw" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.050078 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/efda2c60-f018-417a-a73d-2727be57b558-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-6stjq\" (UID: \"efda2c60-f018-417a-a73d-2727be57b558\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6stjq" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.050160 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b7c6fefc-e60e-423d-ad15-2e16173ae01b-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-vk2rn\" (UID: \"b7c6fefc-e60e-423d-ad15-2e16173ae01b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vk2rn" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.050197 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9nzq\" (UniqueName: \"kubernetes.io/projected/8fe93f79-239c-4b6a-bd22-bbdf55aff0af-kube-api-access-s9nzq\") pod \"catalog-operator-68c6474976-gf5nr\" (UID: \"8fe93f79-239c-4b6a-bd22-bbdf55aff0af\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gf5nr" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.050534 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/8fe93f79-239c-4b6a-bd22-bbdf55aff0af-profile-collector-cert\") pod \"catalog-operator-68c6474976-gf5nr\" (UID: \"8fe93f79-239c-4b6a-bd22-bbdf55aff0af\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gf5nr" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.050792 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ckhx\" (UniqueName: \"kubernetes.io/projected/825c6b77-c03a-463c-b9a4-d26a1ac398f0-kube-api-access-2ckhx\") pod \"downloads-7954f5f757-45vfv\" (UID: \"825c6b77-c03a-463c-b9a4-d26a1ac398f0\") " pod="openshift-console/downloads-7954f5f757-45vfv" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.050849 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d09bb09c-7ad0-4971-b6a2-1b37bff617b5-metrics-tls\") pod \"ingress-operator-5b745b69d9-rtgqn\" (UID: \"d09bb09c-7ad0-4971-b6a2-1b37bff617b5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rtgqn" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.051162 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgncr\" (UniqueName: \"kubernetes.io/projected/efda2c60-f018-417a-a73d-2727be57b558-kube-api-access-qgncr\") pod \"kube-storage-version-migrator-operator-b67b599dd-6stjq\" (UID: \"efda2c60-f018-417a-a73d-2727be57b558\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6stjq" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.051274 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b0b480d-ae68-4b26-b9f8-6b3caef70971-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-jffj8\" (UID: \"8b0b480d-ae68-4b26-b9f8-6b3caef70971\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-jffj8" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.058439 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e8ca532e-b0d7-494c-886f-bff0c8009707-oauth-serving-cert\") pod \"console-f9d7485db-sqrz5\" (UID: \"e8ca532e-b0d7-494c-886f-bff0c8009707\") " pod="openshift-console/console-f9d7485db-sqrz5" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.068121 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.073797 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e8ca532e-b0d7-494c-886f-bff0c8009707-service-ca\") pod \"console-f9d7485db-sqrz5\" (UID: \"e8ca532e-b0d7-494c-886f-bff0c8009707\") " pod="openshift-console/console-f9d7485db-sqrz5" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.088383 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.090146 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e8ca532e-b0d7-494c-886f-bff0c8009707-console-config\") pod \"console-f9d7485db-sqrz5\" (UID: \"e8ca532e-b0d7-494c-886f-bff0c8009707\") " pod="openshift-console/console-f9d7485db-sqrz5" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.108219 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.129025 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 20 10:58:20 crc 
kubenswrapper[4860]: I0320 10:58:20.148506 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.168805 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.187942 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.215929 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.222163 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d09bb09c-7ad0-4971-b6a2-1b37bff617b5-trusted-ca\") pod \"ingress-operator-5b745b69d9-rtgqn\" (UID: \"d09bb09c-7ad0-4971-b6a2-1b37bff617b5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rtgqn" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.228791 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.236289 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d09bb09c-7ad0-4971-b6a2-1b37bff617b5-metrics-tls\") pod \"ingress-operator-5b745b69d9-rtgqn\" (UID: \"d09bb09c-7ad0-4971-b6a2-1b37bff617b5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rtgqn" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.248988 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 
10:58:20.268357 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.288188 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.307309 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.328423 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.348297 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.368403 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.389882 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.409806 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.429683 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.448806 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.468925 4860 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.489047 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.504981 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1414be44-7a88-4f16-9653-51a5793bd729-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-qgkd4\" (UID: \"1414be44-7a88-4f16-9653-51a5793bd729\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qgkd4" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.508445 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.529660 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.549559 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.568725 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.590150 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.608871 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.628817 4860 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.648028 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.668184 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.688678 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.695755 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b7c6fefc-e60e-423d-ad15-2e16173ae01b-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-vk2rn\" (UID: \"b7c6fefc-e60e-423d-ad15-2e16173ae01b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vk2rn" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.708454 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.710638 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/efda2c60-f018-417a-a73d-2727be57b558-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-6stjq\" (UID: \"efda2c60-f018-417a-a73d-2727be57b558\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6stjq" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.729508 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 20 10:58:20 crc 
kubenswrapper[4860]: I0320 10:58:20.735821 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/efda2c60-f018-417a-a73d-2727be57b558-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-6stjq\" (UID: \"efda2c60-f018-417a-a73d-2727be57b558\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6stjq" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.749527 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.768535 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.789153 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.808136 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.826026 4860 request.go:700] Waited for 1.009372347s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0 Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.827320 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.848934 4860 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.869203 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.888752 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.909035 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.929132 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.949156 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.968861 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.988486 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.989630 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/b2a9fedf-d226-4388-8432-b22efd3b74bb-signing-cabundle\") pod \"service-ca-9c57cc56f-9ktqw\" (UID: \"b2a9fedf-d226-4388-8432-b22efd3b74bb\") " pod="openshift-service-ca/service-ca-9c57cc56f-9ktqw"
Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.008520 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.014155 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/b2a9fedf-d226-4388-8432-b22efd3b74bb-signing-key\") pod \"service-ca-9c57cc56f-9ktqw\" (UID: \"b2a9fedf-d226-4388-8432-b22efd3b74bb\") " pod="openshift-service-ca/service-ca-9c57cc56f-9ktqw"
Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.029261 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.047870 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Mar 20 10:58:21 crc kubenswrapper[4860]: E0320 10:58:21.048969 4860 secret.go:188] Couldn't get secret openshift-authentication-operator/serving-cert: failed to sync secret cache: timed out waiting for the condition
Mar 20 10:58:21 crc kubenswrapper[4860]: E0320 10:58:21.049137 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b0b480d-ae68-4b26-b9f8-6b3caef70971-serving-cert podName:8b0b480d-ae68-4b26-b9f8-6b3caef70971 nodeName:}" failed. No retries permitted until 2026-03-20 10:58:21.549113985 +0000 UTC m=+225.770474893 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/8b0b480d-ae68-4b26-b9f8-6b3caef70971-serving-cert") pod "authentication-operator-69f744f599-jffj8" (UID: "8b0b480d-ae68-4b26-b9f8-6b3caef70971") : failed to sync secret cache: timed out waiting for the condition
Mar 20 10:58:21 crc kubenswrapper[4860]: E0320 10:58:21.048968 4860 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition
Mar 20 10:58:21 crc kubenswrapper[4860]: E0320 10:58:21.049422 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8fe93f79-239c-4b6a-bd22-bbdf55aff0af-srv-cert podName:8fe93f79-239c-4b6a-bd22-bbdf55aff0af nodeName:}" failed. No retries permitted until 2026-03-20 10:58:21.549408973 +0000 UTC m=+225.770769881 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/8fe93f79-239c-4b6a-bd22-bbdf55aff0af-srv-cert") pod "catalog-operator-68c6474976-gf5nr" (UID: "8fe93f79-239c-4b6a-bd22-bbdf55aff0af") : failed to sync secret cache: timed out waiting for the condition
Mar 20 10:58:21 crc kubenswrapper[4860]: E0320 10:58:21.049032 4860 configmap.go:193] Couldn't get configMap openshift-authentication-operator/service-ca-bundle: failed to sync configmap cache: timed out waiting for the condition
Mar 20 10:58:21 crc kubenswrapper[4860]: E0320 10:58:21.049604 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8b0b480d-ae68-4b26-b9f8-6b3caef70971-service-ca-bundle podName:8b0b480d-ae68-4b26-b9f8-6b3caef70971 nodeName:}" failed. No retries permitted until 2026-03-20 10:58:21.549593418 +0000 UTC m=+225.770954326 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/8b0b480d-ae68-4b26-b9f8-6b3caef70971-service-ca-bundle") pod "authentication-operator-69f744f599-jffj8" (UID: "8b0b480d-ae68-4b26-b9f8-6b3caef70971") : failed to sync configmap cache: timed out waiting for the condition
Mar 20 10:58:21 crc kubenswrapper[4860]: E0320 10:58:21.049301 4860 configmap.go:193] Couldn't get configMap openshift-authentication-operator/authentication-operator-config: failed to sync configmap cache: timed out waiting for the condition
Mar 20 10:58:21 crc kubenswrapper[4860]: E0320 10:58:21.049824 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8b0b480d-ae68-4b26-b9f8-6b3caef70971-config podName:8b0b480d-ae68-4b26-b9f8-6b3caef70971 nodeName:}" failed. No retries permitted until 2026-03-20 10:58:21.549808634 +0000 UTC m=+225.771169542 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/8b0b480d-ae68-4b26-b9f8-6b3caef70971-config") pod "authentication-operator-69f744f599-jffj8" (UID: "8b0b480d-ae68-4b26-b9f8-6b3caef70971") : failed to sync configmap cache: timed out waiting for the condition
Mar 20 10:58:21 crc kubenswrapper[4860]: E0320 10:58:21.052504 4860 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/pprof-cert: failed to sync secret cache: timed out waiting for the condition
Mar 20 10:58:21 crc kubenswrapper[4860]: E0320 10:58:21.052571 4860 configmap.go:193] Couldn't get configMap openshift-authentication-operator/trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition
Mar 20 10:58:21 crc kubenswrapper[4860]: E0320 10:58:21.052739 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8fe93f79-239c-4b6a-bd22-bbdf55aff0af-profile-collector-cert podName:8fe93f79-239c-4b6a-bd22-bbdf55aff0af nodeName:}" failed. No retries permitted until 2026-03-20 10:58:21.552723995 +0000 UTC m=+225.774084903 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "profile-collector-cert" (UniqueName: "kubernetes.io/secret/8fe93f79-239c-4b6a-bd22-bbdf55aff0af-profile-collector-cert") pod "catalog-operator-68c6474976-gf5nr" (UID: "8fe93f79-239c-4b6a-bd22-bbdf55aff0af") : failed to sync secret cache: timed out waiting for the condition
Mar 20 10:58:21 crc kubenswrapper[4860]: E0320 10:58:21.052871 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8b0b480d-ae68-4b26-b9f8-6b3caef70971-trusted-ca-bundle podName:8b0b480d-ae68-4b26-b9f8-6b3caef70971 nodeName:}" failed. No retries permitted until 2026-03-20 10:58:21.552853109 +0000 UTC m=+225.774214017 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/8b0b480d-ae68-4b26-b9f8-6b3caef70971-trusted-ca-bundle") pod "authentication-operator-69f744f599-jffj8" (UID: "8b0b480d-ae68-4b26-b9f8-6b3caef70971") : failed to sync configmap cache: timed out waiting for the condition
Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.068170 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.088479 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.108948 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.129288 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.149938 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.169704 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.217220 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7g2j\" (UniqueName: \"kubernetes.io/projected/6ef8eec8-b86d-4f5a-931e-c76e11c07f94-kube-api-access-x7g2j\") pod \"controller-manager-879f6c89f-7xnrh\" (UID: \"6ef8eec8-b86d-4f5a-931e-c76e11c07f94\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7xnrh"
Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.229929 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.233948 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqjlf\" (UniqueName: \"kubernetes.io/projected/cfa87f04-40c6-4575-b647-fb13a115b81d-kube-api-access-bqjlf\") pod \"openshift-controller-manager-operator-756b6f6bc6-5pq7k\" (UID: \"cfa87f04-40c6-4575-b647-fb13a115b81d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5pq7k"
Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.280672 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6c5c\" (UniqueName: \"kubernetes.io/projected/a58a54d3-d454-4503-8b70-0e78784efdfc-kube-api-access-v6c5c\") pod \"etcd-operator-b45778765-hfxcc\" (UID: \"a58a54d3-d454-4503-8b70-0e78784efdfc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hfxcc"
Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.283848 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcm82\" (UniqueName: \"kubernetes.io/projected/ba450d5b-c962-4788-8215-d1eb12f9b314-kube-api-access-vcm82\") pod \"openshift-apiserver-operator-796bbdcf4f-76g8f\" (UID: \"ba450d5b-c962-4788-8215-d1eb12f9b314\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-76g8f"
Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.288167 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.323549 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.330593 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.348321 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.368518 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.388202 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.408936 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.428765 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.448583 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.458657 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-76g8f"
Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.475567 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-hfxcc"
Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.480521 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.483018 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-7xnrh"
Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.488257 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.509559 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.526460 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5pq7k"
Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.551246 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.570113 4860 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.581507 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/8fe93f79-239c-4b6a-bd22-bbdf55aff0af-profile-collector-cert\") pod \"catalog-operator-68c6474976-gf5nr\" (UID: \"8fe93f79-239c-4b6a-bd22-bbdf55aff0af\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gf5nr"
Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.581631 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b0b480d-ae68-4b26-b9f8-6b3caef70971-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-jffj8\" (UID: \"8b0b480d-ae68-4b26-b9f8-6b3caef70971\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jffj8"
Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.581688 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b0b480d-ae68-4b26-b9f8-6b3caef70971-service-ca-bundle\") pod \"authentication-operator-69f744f599-jffj8\" (UID: \"8b0b480d-ae68-4b26-b9f8-6b3caef70971\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jffj8"
Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.581720 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8fe93f79-239c-4b6a-bd22-bbdf55aff0af-srv-cert\") pod \"catalog-operator-68c6474976-gf5nr\" (UID: \"8fe93f79-239c-4b6a-bd22-bbdf55aff0af\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gf5nr"
Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.581737 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b0b480d-ae68-4b26-b9f8-6b3caef70971-config\") pod \"authentication-operator-69f744f599-jffj8\" (UID: \"8b0b480d-ae68-4b26-b9f8-6b3caef70971\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jffj8"
Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.581771 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b0b480d-ae68-4b26-b9f8-6b3caef70971-serving-cert\") pod \"authentication-operator-69f744f599-jffj8\" (UID: \"8b0b480d-ae68-4b26-b9f8-6b3caef70971\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jffj8"
Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.583654 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b0b480d-ae68-4b26-b9f8-6b3caef70971-service-ca-bundle\") pod \"authentication-operator-69f744f599-jffj8\" (UID: \"8b0b480d-ae68-4b26-b9f8-6b3caef70971\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jffj8"
Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.584108 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b0b480d-ae68-4b26-b9f8-6b3caef70971-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-jffj8\" (UID: \"8b0b480d-ae68-4b26-b9f8-6b3caef70971\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jffj8"
Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.584196 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b0b480d-ae68-4b26-b9f8-6b3caef70971-config\") pod \"authentication-operator-69f744f599-jffj8\" (UID: \"8b0b480d-ae68-4b26-b9f8-6b3caef70971\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jffj8"
Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.586892 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b0b480d-ae68-4b26-b9f8-6b3caef70971-serving-cert\") pod \"authentication-operator-69f744f599-jffj8\" (UID: \"8b0b480d-ae68-4b26-b9f8-6b3caef70971\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jffj8"
Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.586983 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8fe93f79-239c-4b6a-bd22-bbdf55aff0af-srv-cert\") pod \"catalog-operator-68c6474976-gf5nr\" (UID: \"8fe93f79-239c-4b6a-bd22-bbdf55aff0af\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gf5nr"
Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.587469 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.589192 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/8fe93f79-239c-4b6a-bd22-bbdf55aff0af-profile-collector-cert\") pod \"catalog-operator-68c6474976-gf5nr\" (UID: \"8fe93f79-239c-4b6a-bd22-bbdf55aff0af\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gf5nr"
Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.628890 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.648193 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.669697 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.688761 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.692177 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-76g8f"]
Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.708308 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.726878 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-hfxcc"]
Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.735387 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.750171 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.769265 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.775253 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5pq7k"]
Mar 20 10:58:21 crc kubenswrapper[4860]: W0320 10:58:21.787061 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfa87f04_40c6_4575_b647_fb13a115b81d.slice/crio-5b41337622fad4321f7bb86fe4803fd1cf32c952050c583a599de54f99215a27 WatchSource:0}: Error finding container 5b41337622fad4321f7bb86fe4803fd1cf32c952050c583a599de54f99215a27: Status 404 returned error can't find the container with id 5b41337622fad4321f7bb86fe4803fd1cf32c952050c583a599de54f99215a27
Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.787845 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.809258 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.827014 4860 request.go:700] Waited for 1.894313224s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/serviceaccounts/cluster-image-registry-operator/token
Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.850037 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d989g\" (UniqueName: \"kubernetes.io/projected/d62cba0d-d390-4638-aa42-59631e4bf118-kube-api-access-d989g\") pod \"cluster-image-registry-operator-dc59b4c8b-4429p\" (UID: \"d62cba0d-d390-4638-aa42-59631e4bf118\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4429p"
Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.870520 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqnwc\" (UniqueName: \"kubernetes.io/projected/3587f3ba-577b-425a-adf5-336a8977dcc5-kube-api-access-rqnwc\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x"
Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.887294 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcrkt\" (UniqueName: \"kubernetes.io/projected/d582dc3e-7510-42be-aa3a-1d15b35c327c-kube-api-access-bcrkt\") pod \"router-default-5444994796-5gdgj\" (UID: \"d582dc3e-7510-42be-aa3a-1d15b35c327c\") " pod="openshift-ingress/router-default-5444994796-5gdgj"
Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.904262 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d88n2\" (UniqueName: \"kubernetes.io/projected/f15060fa-5a28-4a12-be7b-2823e921eb90-kube-api-access-d88n2\") pod \"route-controller-manager-6576b87f9c-nxq82\" (UID: \"f15060fa-5a28-4a12-be7b-2823e921eb90\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nxq82"
Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.923422 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9krs\" (UniqueName: \"kubernetes.io/projected/0eff7ea5-251b-44de-b129-c604349d6e6c-kube-api-access-c9krs\") pod \"apiserver-7bbb656c7d-twkfs\" (UID: \"0eff7ea5-251b-44de-b129-c604349d6e6c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-twkfs"
Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.927078 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-twkfs"
Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.942816 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6r94\" (UniqueName: \"kubernetes.io/projected/2f2e3391-68a2-43a8-aba9-17e583066b03-kube-api-access-r6r94\") pod \"machine-approver-56656f9798-hz7zj\" (UID: \"2f2e3391-68a2-43a8-aba9-17e583066b03\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hz7zj"
Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.965377 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7xnrh"]
Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.967456 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsxn6\" (UniqueName: \"kubernetes.io/projected/24a452b3-94a8-4c29-8409-cb1a8dd11555-kube-api-access-rsxn6\") pod \"machine-config-controller-84d6567774-pr8t8\" (UID: \"24a452b3-94a8-4c29-8409-cb1a8dd11555\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pr8t8"
Mar 20 10:58:21 crc kubenswrapper[4860]: W0320 10:58:21.978281 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ef8eec8_b86d_4f5a_931e_c76e11c07f94.slice/crio-2980d18f8ba28b033dbf5d36b160c7efad5ae1c6664cb1dba27569a3bcc37db2 WatchSource:0}: Error finding container 2980d18f8ba28b033dbf5d36b160c7efad5ae1c6664cb1dba27569a3bcc37db2: Status 404 returned error can't find the container with id 2980d18f8ba28b033dbf5d36b160c7efad5ae1c6664cb1dba27569a3bcc37db2
Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.989019 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rh6md\" (UniqueName: \"kubernetes.io/projected/37972326-b1df-484f-ab10-9c595b145d8c-kube-api-access-rh6md\") pod \"migrator-59844c95c7-lcdbx\" (UID: \"37972326-b1df-484f-ab10-9c595b145d8c\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lcdbx"
Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.996747 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nxq82"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.004187 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lcdbx"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.004440 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkfzm\" (UniqueName: \"kubernetes.io/projected/a8f2eaf6-3749-4695-8df1-5972598c8ac6-kube-api-access-tkfzm\") pod \"apiserver-76f77b778f-vpp2k\" (UID: \"a8f2eaf6-3749-4695-8df1-5972598c8ac6\") " pod="openshift-apiserver/apiserver-76f77b778f-vpp2k"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.019992 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-5gdgj"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.023951 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8wdj\" (UniqueName: \"kubernetes.io/projected/7ee7ff56-a3fb-465a-9ab5-8cba6bbfcd0e-kube-api-access-v8wdj\") pod \"console-operator-58897d9998-pc5tf\" (UID: \"7ee7ff56-a3fb-465a-9ab5-8cba6bbfcd0e\") " pod="openshift-console-operator/console-operator-58897d9998-pc5tf"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.032939 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pr8t8"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.050364 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lj2w\" (UniqueName: \"kubernetes.io/projected/f67156a9-f474-4d80-9789-ffbfcc9ec78b-kube-api-access-9lj2w\") pod \"openshift-config-operator-7777fb866f-xmkqm\" (UID: \"f67156a9-f474-4d80-9789-ffbfcc9ec78b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xmkqm"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.066288 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f1a219c6-74dc-4511-867e-cf2fce301cad-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wv5fd\" (UID: \"f1a219c6-74dc-4511-867e-cf2fce301cad\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wv5fd"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.089763 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d62cba0d-d390-4638-aa42-59631e4bf118-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-4429p\" (UID: \"d62cba0d-d390-4638-aa42-59631e4bf118\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4429p"
Mar 20 10:58:22 crc kubenswrapper[4860]: W0320 10:58:22.095947 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd582dc3e_7510_42be_aa3a_1d15b35c327c.slice/crio-4007d87bec9e8cbbb2f96fba62cbd83fc23bd3a2fceb0d216c6cc21d5756432d WatchSource:0}: Error finding container 4007d87bec9e8cbbb2f96fba62cbd83fc23bd3a2fceb0d216c6cc21d5756432d: Status 404 returned error can't find the container with id 4007d87bec9e8cbbb2f96fba62cbd83fc23bd3a2fceb0d216c6cc21d5756432d
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.112907 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lk8bz\" (UniqueName: \"kubernetes.io/projected/e8ca532e-b0d7-494c-886f-bff0c8009707-kube-api-access-lk8bz\") pod \"console-f9d7485db-sqrz5\" (UID: \"e8ca532e-b0d7-494c-886f-bff0c8009707\") " pod="openshift-console/console-f9d7485db-sqrz5"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.126742 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-twkfs"]
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.133665 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqqmp\" (UniqueName: \"kubernetes.io/projected/d4ce1856-395a-4003-9642-61da7cbdd789-kube-api-access-zqqmp\") pod \"machine-api-operator-5694c8668f-s52jd\" (UID: \"d4ce1856-395a-4003-9642-61da7cbdd789\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-s52jd"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.141484 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-srz5x"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.150983 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwq9m\" (UniqueName: \"kubernetes.io/projected/c9dab77c-3c60-4c91-8c0a-31791124462d-kube-api-access-rwq9m\") pod \"dns-operator-744455d44c-6xl8q\" (UID: \"c9dab77c-3c60-4c91-8c0a-31791124462d\") " pod="openshift-dns-operator/dns-operator-744455d44c-6xl8q"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.169909 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rdtv\" (UniqueName: \"kubernetes.io/projected/8b0b480d-ae68-4b26-b9f8-6b3caef70971-kube-api-access-7rdtv\") pod \"authentication-operator-69f744f599-jffj8\" (UID: \"8b0b480d-ae68-4b26-b9f8-6b3caef70971\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jffj8"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.174816 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xmkqm"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.186780 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d09bb09c-7ad0-4971-b6a2-1b37bff617b5-bound-sa-token\") pod \"ingress-operator-5b745b69d9-rtgqn\" (UID: \"d09bb09c-7ad0-4971-b6a2-1b37bff617b5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rtgqn"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.194012 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-6xl8q"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.200772 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hz7zj"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.217542 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2wm2\" (UniqueName: \"kubernetes.io/projected/b7c6fefc-e60e-423d-ad15-2e16173ae01b-kube-api-access-j2wm2\") pod \"multus-admission-controller-857f4d67dd-vk2rn\" (UID: \"b7c6fefc-e60e-423d-ad15-2e16173ae01b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vk2rn"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.218520 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-pc5tf"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.240600 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-jffj8"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.256464 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcc7l\" (UniqueName: \"kubernetes.io/projected/d09bb09c-7ad0-4971-b6a2-1b37bff617b5-kube-api-access-lcc7l\") pod \"ingress-operator-5b745b69d9-rtgqn\" (UID: \"d09bb09c-7ad0-4971-b6a2-1b37bff617b5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rtgqn"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.259554 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-nxq82"]
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.267021 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67f7l\" (UniqueName: \"kubernetes.io/projected/b2a9fedf-d226-4388-8432-b22efd3b74bb-kube-api-access-67f7l\") pod \"service-ca-9c57cc56f-9ktqw\" (UID: \"b2a9fedf-d226-4388-8432-b22efd3b74bb\") " pod="openshift-service-ca/service-ca-9c57cc56f-9ktqw"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.267603 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s682l\" (UniqueName: \"kubernetes.io/projected/1414be44-7a88-4f16-9653-51a5793bd729-kube-api-access-s682l\") pod \"cluster-samples-operator-665b6dd947-qgkd4\" (UID: \"1414be44-7a88-4f16-9653-51a5793bd729\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qgkd4"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.273605 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4429p"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.282856 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-vpp2k"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.290983 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-s52jd"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.314334 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wv5fd"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.317603 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9nzq\" (UniqueName: \"kubernetes.io/projected/8fe93f79-239c-4b6a-bd22-bbdf55aff0af-kube-api-access-s9nzq\") pod \"catalog-operator-68c6474976-gf5nr\" (UID: \"8fe93f79-239c-4b6a-bd22-bbdf55aff0af\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gf5nr"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.322415 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ckhx\" (UniqueName: \"kubernetes.io/projected/825c6b77-c03a-463c-b9a4-d26a1ac398f0-kube-api-access-2ckhx\") pod \"downloads-7954f5f757-45vfv\" (UID: \"825c6b77-c03a-463c-b9a4-d26a1ac398f0\") " pod="openshift-console/downloads-7954f5f757-45vfv"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.346947 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgncr\" (UniqueName: \"kubernetes.io/projected/efda2c60-f018-417a-a73d-2727be57b558-kube-api-access-qgncr\") pod \"kube-storage-version-migrator-operator-b67b599dd-6stjq\" (UID: \"efda2c60-f018-417a-a73d-2727be57b558\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6stjq"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.348570 4860 patch_prober.go:28] interesting pod/machine-config-daemon-kvdqp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.348644 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp"
podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.352031 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-lcdbx"] Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.360929 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-sqrz5" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.375017 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rtgqn" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.387571 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-45vfv" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.395615 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/34043403-110c-4547-81a4-7af1429878cd-tmpfs\") pod \"packageserver-d55dfcdfc-vfkd4\" (UID: \"34043403-110c-4547-81a4-7af1429878cd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vfkd4" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.395673 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3c8d3f73-b9b3-48cc-b6a0-f882d45bc9d3-images\") pod \"machine-config-operator-74547568cd-2glmz\" (UID: \"3c8d3f73-b9b3-48cc-b6a0-f882d45bc9d3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2glmz" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.395699 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/f171fccd-40ef-44a6-941f-ef1f2f4d2c2a-config\") pod \"kube-controller-manager-operator-78b949d7b-v6ff7\" (UID: \"f171fccd-40ef-44a6-941f-ef1f2f4d2c2a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v6ff7" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.395770 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.395804 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnfgq\" (UniqueName: \"kubernetes.io/projected/3c8d3f73-b9b3-48cc-b6a0-f882d45bc9d3-kube-api-access-xnfgq\") pod \"machine-config-operator-74547568cd-2glmz\" (UID: \"3c8d3f73-b9b3-48cc-b6a0-f882d45bc9d3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2glmz" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.395823 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/437c32d4-4b5f-4657-86d6-5214e3bfc01f-secret-volume\") pod \"collect-profiles-29566725-d6wf9\" (UID: \"437c32d4-4b5f-4657-86d6-5214e3bfc01f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-d6wf9" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.395845 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/8bc351c5-b724-443e-a7e2-f4abba352cef-control-plane-machine-set-operator-tls\") pod 
\"control-plane-machine-set-operator-78cbb6b69f-jnqsw\" (UID: \"8bc351c5-b724-443e-a7e2-f4abba352cef\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jnqsw" Mar 20 10:58:22 crc kubenswrapper[4860]: E0320 10:58:22.396466 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:22.896445783 +0000 UTC m=+227.117806681 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.395915 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gvm7\" (UniqueName: \"kubernetes.io/projected/628f2025-d050-42a9-bf56-9daa0e5c001b-kube-api-access-6gvm7\") pod \"package-server-manager-789f6589d5-mcqdw\" (UID: \"628f2025-d050-42a9-bf56-9daa0e5c001b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mcqdw" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.397742 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/39b41087-226b-4f73-9fc4-64616b430f2d-ca-trust-extracted\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.397763 4860 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n92tj\" (UniqueName: \"kubernetes.io/projected/39b41087-226b-4f73-9fc4-64616b430f2d-kube-api-access-n92tj\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.397808 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/34043403-110c-4547-81a4-7af1429878cd-webhook-cert\") pod \"packageserver-d55dfcdfc-vfkd4\" (UID: \"34043403-110c-4547-81a4-7af1429878cd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vfkd4" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.398081 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/628f2025-d050-42a9-bf56-9daa0e5c001b-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-mcqdw\" (UID: \"628f2025-d050-42a9-bf56-9daa0e5c001b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mcqdw" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.398111 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/39b41087-226b-4f73-9fc4-64616b430f2d-registry-certificates\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.398130 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/39b41087-226b-4f73-9fc4-64616b430f2d-installation-pull-secrets\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.398150 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpk7d\" (UniqueName: \"kubernetes.io/projected/437c32d4-4b5f-4657-86d6-5214e3bfc01f-kube-api-access-wpk7d\") pod \"collect-profiles-29566725-d6wf9\" (UID: \"437c32d4-4b5f-4657-86d6-5214e3bfc01f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-d6wf9" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.398194 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8d5eff6-150c-4314-8ebc-38b3660ce01a-config\") pod \"kube-apiserver-operator-766d6c64bb-jvcqp\" (UID: \"e8d5eff6-150c-4314-8ebc-38b3660ce01a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jvcqp" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.398962 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8d5eff6-150c-4314-8ebc-38b3660ce01a-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-jvcqp\" (UID: \"e8d5eff6-150c-4314-8ebc-38b3660ce01a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jvcqp" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.399002 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/39b41087-226b-4f73-9fc4-64616b430f2d-trusted-ca\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.399019 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3c8d3f73-b9b3-48cc-b6a0-f882d45bc9d3-proxy-tls\") pod \"machine-config-operator-74547568cd-2glmz\" (UID: \"3c8d3f73-b9b3-48cc-b6a0-f882d45bc9d3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2glmz" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.399037 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7mtp\" (UniqueName: \"kubernetes.io/projected/9d98ac55-cf65-4f72-805b-dd3da2742004-kube-api-access-p7mtp\") pod \"olm-operator-6b444d44fb-9ffz6\" (UID: \"9d98ac55-cf65-4f72-805b-dd3da2742004\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9ffz6" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.399054 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/403ca5f6-bd52-40de-88d6-5151b3202c76-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zhgh4\" (UID: \"403ca5f6-bd52-40de-88d6-5151b3202c76\") " pod="openshift-marketplace/marketplace-operator-79b997595-zhgh4" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.399110 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/39b41087-226b-4f73-9fc4-64616b430f2d-registry-tls\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.399137 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/403ca5f6-bd52-40de-88d6-5151b3202c76-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zhgh4\" (UID: \"403ca5f6-bd52-40de-88d6-5151b3202c76\") " pod="openshift-marketplace/marketplace-operator-79b997595-zhgh4" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.399205 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-248wx\" (UniqueName: \"kubernetes.io/projected/34043403-110c-4547-81a4-7af1429878cd-kube-api-access-248wx\") pod \"packageserver-d55dfcdfc-vfkd4\" (UID: \"34043403-110c-4547-81a4-7af1429878cd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vfkd4" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.400374 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9d98ac55-cf65-4f72-805b-dd3da2742004-profile-collector-cert\") pod \"olm-operator-6b444d44fb-9ffz6\" (UID: \"9d98ac55-cf65-4f72-805b-dd3da2742004\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9ffz6" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.400443 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5043c51-bd3f-461f-b011-a42ad38ed7d4-serving-cert\") pod \"service-ca-operator-777779d784-x4x44\" (UID: \"f5043c51-bd3f-461f-b011-a42ad38ed7d4\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-x4x44" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.400467 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3c8d3f73-b9b3-48cc-b6a0-f882d45bc9d3-auth-proxy-config\") pod \"machine-config-operator-74547568cd-2glmz\" (UID: 
\"3c8d3f73-b9b3-48cc-b6a0-f882d45bc9d3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2glmz" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.400753 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqvlf\" (UniqueName: \"kubernetes.io/projected/403ca5f6-bd52-40de-88d6-5151b3202c76-kube-api-access-hqvlf\") pod \"marketplace-operator-79b997595-zhgh4\" (UID: \"403ca5f6-bd52-40de-88d6-5151b3202c76\") " pod="openshift-marketplace/marketplace-operator-79b997595-zhgh4" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.400800 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5043c51-bd3f-461f-b011-a42ad38ed7d4-config\") pod \"service-ca-operator-777779d784-x4x44\" (UID: \"f5043c51-bd3f-461f-b011-a42ad38ed7d4\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-x4x44" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.400822 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e8d5eff6-150c-4314-8ebc-38b3660ce01a-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-jvcqp\" (UID: \"e8d5eff6-150c-4314-8ebc-38b3660ce01a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jvcqp" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.400899 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/39b41087-226b-4f73-9fc4-64616b430f2d-bound-sa-token\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.401438 4860 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fplbn\" (UniqueName: \"kubernetes.io/projected/f5043c51-bd3f-461f-b011-a42ad38ed7d4-kube-api-access-fplbn\") pod \"service-ca-operator-777779d784-x4x44\" (UID: \"f5043c51-bd3f-461f-b011-a42ad38ed7d4\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-x4x44" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.401476 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/437c32d4-4b5f-4657-86d6-5214e3bfc01f-config-volume\") pod \"collect-profiles-29566725-d6wf9\" (UID: \"437c32d4-4b5f-4657-86d6-5214e3bfc01f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-d6wf9" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.401502 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9d98ac55-cf65-4f72-805b-dd3da2742004-srv-cert\") pod \"olm-operator-6b444d44fb-9ffz6\" (UID: \"9d98ac55-cf65-4f72-805b-dd3da2742004\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9ffz6" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.401537 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9q9t\" (UniqueName: \"kubernetes.io/projected/8bc351c5-b724-443e-a7e2-f4abba352cef-kube-api-access-p9q9t\") pod \"control-plane-machine-set-operator-78cbb6b69f-jnqsw\" (UID: \"8bc351c5-b724-443e-a7e2-f4abba352cef\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jnqsw" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.401559 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f171fccd-40ef-44a6-941f-ef1f2f4d2c2a-kube-api-access\") pod 
\"kube-controller-manager-operator-78b949d7b-v6ff7\" (UID: \"f171fccd-40ef-44a6-941f-ef1f2f4d2c2a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v6ff7" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.401648 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hjpq\" (UniqueName: \"kubernetes.io/projected/ba2ab33e-6ecc-4eac-9aaa-256e6ff68236-kube-api-access-5hjpq\") pod \"auto-csr-approver-29566738-5cj22\" (UID: \"ba2ab33e-6ecc-4eac-9aaa-256e6ff68236\") " pod="openshift-infra/auto-csr-approver-29566738-5cj22" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.401806 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f171fccd-40ef-44a6-941f-ef1f2f4d2c2a-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-v6ff7\" (UID: \"f171fccd-40ef-44a6-941f-ef1f2f4d2c2a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v6ff7" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.401851 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/34043403-110c-4547-81a4-7af1429878cd-apiservice-cert\") pod \"packageserver-d55dfcdfc-vfkd4\" (UID: \"34043403-110c-4547-81a4-7af1429878cd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vfkd4" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.414045 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qgkd4" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.422562 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-pr8t8"] Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.427674 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6stjq" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.437100 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-vk2rn" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.481634 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gf5nr" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.484078 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-srz5x"] Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.502785 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.503829 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8d5eff6-150c-4314-8ebc-38b3660ce01a-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-jvcqp\" (UID: \"e8d5eff6-150c-4314-8ebc-38b3660ce01a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jvcqp" Mar 20 
10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.503885 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/39b41087-226b-4f73-9fc4-64616b430f2d-trusted-ca\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.503904 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3c8d3f73-b9b3-48cc-b6a0-f882d45bc9d3-proxy-tls\") pod \"machine-config-operator-74547568cd-2glmz\" (UID: \"3c8d3f73-b9b3-48cc-b6a0-f882d45bc9d3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2glmz" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.503936 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/403ca5f6-bd52-40de-88d6-5151b3202c76-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zhgh4\" (UID: \"403ca5f6-bd52-40de-88d6-5151b3202c76\") " pod="openshift-marketplace/marketplace-operator-79b997595-zhgh4" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.503961 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/37b4c0fc-6a82-4f6b-85fc-233090358f9c-node-bootstrap-token\") pod \"machine-config-server-x4xrf\" (UID: \"37b4c0fc-6a82-4f6b-85fc-233090358f9c\") " pod="openshift-machine-config-operator/machine-config-server-x4xrf" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.503980 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7mtp\" (UniqueName: \"kubernetes.io/projected/9d98ac55-cf65-4f72-805b-dd3da2742004-kube-api-access-p7mtp\") pod \"olm-operator-6b444d44fb-9ffz6\" 
(UID: \"9d98ac55-cf65-4f72-805b-dd3da2742004\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9ffz6" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.504014 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmjsg\" (UniqueName: \"kubernetes.io/projected/1241cd05-23d3-4e5a-9130-29e7638003a9-kube-api-access-gmjsg\") pod \"ingress-canary-wt65f\" (UID: \"1241cd05-23d3-4e5a-9130-29e7638003a9\") " pod="openshift-ingress-canary/ingress-canary-wt65f" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.504033 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwl24\" (UniqueName: \"kubernetes.io/projected/10eeabbc-0cee-4be7-a0aa-c0fe0c570e0e-kube-api-access-zwl24\") pod \"dns-default-l2xf5\" (UID: \"10eeabbc-0cee-4be7-a0aa-c0fe0c570e0e\") " pod="openshift-dns/dns-default-l2xf5" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.504063 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/39b41087-226b-4f73-9fc4-64616b430f2d-registry-tls\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.504080 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/403ca5f6-bd52-40de-88d6-5151b3202c76-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zhgh4\" (UID: \"403ca5f6-bd52-40de-88d6-5151b3202c76\") " pod="openshift-marketplace/marketplace-operator-79b997595-zhgh4" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.504162 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-248wx\" (UniqueName: 
\"kubernetes.io/projected/34043403-110c-4547-81a4-7af1429878cd-kube-api-access-248wx\") pod \"packageserver-d55dfcdfc-vfkd4\" (UID: \"34043403-110c-4547-81a4-7af1429878cd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vfkd4" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.504200 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9d98ac55-cf65-4f72-805b-dd3da2742004-profile-collector-cert\") pod \"olm-operator-6b444d44fb-9ffz6\" (UID: \"9d98ac55-cf65-4f72-805b-dd3da2742004\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9ffz6" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.504279 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5043c51-bd3f-461f-b011-a42ad38ed7d4-serving-cert\") pod \"service-ca-operator-777779d784-x4x44\" (UID: \"f5043c51-bd3f-461f-b011-a42ad38ed7d4\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-x4x44" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.504303 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3c8d3f73-b9b3-48cc-b6a0-f882d45bc9d3-auth-proxy-config\") pod \"machine-config-operator-74547568cd-2glmz\" (UID: \"3c8d3f73-b9b3-48cc-b6a0-f882d45bc9d3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2glmz" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.504324 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqvlf\" (UniqueName: \"kubernetes.io/projected/403ca5f6-bd52-40de-88d6-5151b3202c76-kube-api-access-hqvlf\") pod \"marketplace-operator-79b997595-zhgh4\" (UID: \"403ca5f6-bd52-40de-88d6-5151b3202c76\") " pod="openshift-marketplace/marketplace-operator-79b997595-zhgh4" Mar 20 10:58:22 crc 
kubenswrapper[4860]: I0320 10:58:22.504357 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5043c51-bd3f-461f-b011-a42ad38ed7d4-config\") pod \"service-ca-operator-777779d784-x4x44\" (UID: \"f5043c51-bd3f-461f-b011-a42ad38ed7d4\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-x4x44" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.504384 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e8d5eff6-150c-4314-8ebc-38b3660ce01a-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-jvcqp\" (UID: \"e8d5eff6-150c-4314-8ebc-38b3660ce01a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jvcqp" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.504418 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/39b41087-226b-4f73-9fc4-64616b430f2d-bound-sa-token\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.504438 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fplbn\" (UniqueName: \"kubernetes.io/projected/f5043c51-bd3f-461f-b011-a42ad38ed7d4-kube-api-access-fplbn\") pod \"service-ca-operator-777779d784-x4x44\" (UID: \"f5043c51-bd3f-461f-b011-a42ad38ed7d4\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-x4x44" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.504495 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/437c32d4-4b5f-4657-86d6-5214e3bfc01f-config-volume\") pod \"collect-profiles-29566725-d6wf9\" (UID: 
\"437c32d4-4b5f-4657-86d6-5214e3bfc01f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-d6wf9" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.504538 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383-plugins-dir\") pod \"csi-hostpathplugin-2k58g\" (UID: \"dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383\") " pod="hostpath-provisioner/csi-hostpathplugin-2k58g" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.504556 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/10eeabbc-0cee-4be7-a0aa-c0fe0c570e0e-config-volume\") pod \"dns-default-l2xf5\" (UID: \"10eeabbc-0cee-4be7-a0aa-c0fe0c570e0e\") " pod="openshift-dns/dns-default-l2xf5" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.504574 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9d98ac55-cf65-4f72-805b-dd3da2742004-srv-cert\") pod \"olm-operator-6b444d44fb-9ffz6\" (UID: \"9d98ac55-cf65-4f72-805b-dd3da2742004\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9ffz6" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.504609 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f171fccd-40ef-44a6-941f-ef1f2f4d2c2a-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-v6ff7\" (UID: \"f171fccd-40ef-44a6-941f-ef1f2f4d2c2a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v6ff7" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.504632 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: 
\"kubernetes.io/host-path/dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383-mountpoint-dir\") pod \"csi-hostpathplugin-2k58g\" (UID: \"dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383\") " pod="hostpath-provisioner/csi-hostpathplugin-2k58g" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.504651 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9q9t\" (UniqueName: \"kubernetes.io/projected/8bc351c5-b724-443e-a7e2-f4abba352cef-kube-api-access-p9q9t\") pod \"control-plane-machine-set-operator-78cbb6b69f-jnqsw\" (UID: \"8bc351c5-b724-443e-a7e2-f4abba352cef\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jnqsw" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.504682 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hjpq\" (UniqueName: \"kubernetes.io/projected/ba2ab33e-6ecc-4eac-9aaa-256e6ff68236-kube-api-access-5hjpq\") pod \"auto-csr-approver-29566738-5cj22\" (UID: \"ba2ab33e-6ecc-4eac-9aaa-256e6ff68236\") " pod="openshift-infra/auto-csr-approver-29566738-5cj22" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.504699 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383-csi-data-dir\") pod \"csi-hostpathplugin-2k58g\" (UID: \"dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383\") " pod="hostpath-provisioner/csi-hostpathplugin-2k58g" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.504730 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f171fccd-40ef-44a6-941f-ef1f2f4d2c2a-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-v6ff7\" (UID: \"f171fccd-40ef-44a6-941f-ef1f2f4d2c2a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v6ff7" Mar 20 10:58:22 crc kubenswrapper[4860]: 
I0320 10:58:22.504769 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/34043403-110c-4547-81a4-7af1429878cd-apiservice-cert\") pod \"packageserver-d55dfcdfc-vfkd4\" (UID: \"34043403-110c-4547-81a4-7af1429878cd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vfkd4" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.504820 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383-registration-dir\") pod \"csi-hostpathplugin-2k58g\" (UID: \"dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383\") " pod="hostpath-provisioner/csi-hostpathplugin-2k58g" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.504862 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/34043403-110c-4547-81a4-7af1429878cd-tmpfs\") pod \"packageserver-d55dfcdfc-vfkd4\" (UID: \"34043403-110c-4547-81a4-7af1429878cd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vfkd4" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.504903 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3c8d3f73-b9b3-48cc-b6a0-f882d45bc9d3-images\") pod \"machine-config-operator-74547568cd-2glmz\" (UID: \"3c8d3f73-b9b3-48cc-b6a0-f882d45bc9d3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2glmz" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.504917 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f171fccd-40ef-44a6-941f-ef1f2f4d2c2a-config\") pod \"kube-controller-manager-operator-78b949d7b-v6ff7\" (UID: \"f171fccd-40ef-44a6-941f-ef1f2f4d2c2a\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v6ff7" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.504933 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/10eeabbc-0cee-4be7-a0aa-c0fe0c570e0e-metrics-tls\") pod \"dns-default-l2xf5\" (UID: \"10eeabbc-0cee-4be7-a0aa-c0fe0c570e0e\") " pod="openshift-dns/dns-default-l2xf5" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.504962 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/37b4c0fc-6a82-4f6b-85fc-233090358f9c-certs\") pod \"machine-config-server-x4xrf\" (UID: \"37b4c0fc-6a82-4f6b-85fc-233090358f9c\") " pod="openshift-machine-config-operator/machine-config-server-x4xrf" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.504980 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383-socket-dir\") pod \"csi-hostpathplugin-2k58g\" (UID: \"dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383\") " pod="hostpath-provisioner/csi-hostpathplugin-2k58g" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.505036 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnfgq\" (UniqueName: \"kubernetes.io/projected/3c8d3f73-b9b3-48cc-b6a0-f882d45bc9d3-kube-api-access-xnfgq\") pod \"machine-config-operator-74547568cd-2glmz\" (UID: \"3c8d3f73-b9b3-48cc-b6a0-f882d45bc9d3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2glmz" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.505056 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/437c32d4-4b5f-4657-86d6-5214e3bfc01f-secret-volume\") pod 
\"collect-profiles-29566725-d6wf9\" (UID: \"437c32d4-4b5f-4657-86d6-5214e3bfc01f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-d6wf9" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.505077 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/8bc351c5-b724-443e-a7e2-f4abba352cef-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-jnqsw\" (UID: \"8bc351c5-b724-443e-a7e2-f4abba352cef\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jnqsw" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.505108 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1241cd05-23d3-4e5a-9130-29e7638003a9-cert\") pod \"ingress-canary-wt65f\" (UID: \"1241cd05-23d3-4e5a-9130-29e7638003a9\") " pod="openshift-ingress-canary/ingress-canary-wt65f" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.505145 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gvm7\" (UniqueName: \"kubernetes.io/projected/628f2025-d050-42a9-bf56-9daa0e5c001b-kube-api-access-6gvm7\") pod \"package-server-manager-789f6589d5-mcqdw\" (UID: \"628f2025-d050-42a9-bf56-9daa0e5c001b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mcqdw" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.505187 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zh75p\" (UniqueName: \"kubernetes.io/projected/dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383-kube-api-access-zh75p\") pod \"csi-hostpathplugin-2k58g\" (UID: \"dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383\") " pod="hostpath-provisioner/csi-hostpathplugin-2k58g" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.505334 4860 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/39b41087-226b-4f73-9fc4-64616b430f2d-ca-trust-extracted\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.505360 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n92tj\" (UniqueName: \"kubernetes.io/projected/39b41087-226b-4f73-9fc4-64616b430f2d-kube-api-access-n92tj\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.505383 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/34043403-110c-4547-81a4-7af1429878cd-webhook-cert\") pod \"packageserver-d55dfcdfc-vfkd4\" (UID: \"34043403-110c-4547-81a4-7af1429878cd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vfkd4" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.505402 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/628f2025-d050-42a9-bf56-9daa0e5c001b-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-mcqdw\" (UID: \"628f2025-d050-42a9-bf56-9daa0e5c001b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mcqdw" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.505476 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/39b41087-226b-4f73-9fc4-64616b430f2d-registry-certificates\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: 
\"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.505497 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/39b41087-226b-4f73-9fc4-64616b430f2d-installation-pull-secrets\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.505536 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpk7d\" (UniqueName: \"kubernetes.io/projected/437c32d4-4b5f-4657-86d6-5214e3bfc01f-kube-api-access-wpk7d\") pod \"collect-profiles-29566725-d6wf9\" (UID: \"437c32d4-4b5f-4657-86d6-5214e3bfc01f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-d6wf9" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.505555 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtll7\" (UniqueName: \"kubernetes.io/projected/37b4c0fc-6a82-4f6b-85fc-233090358f9c-kube-api-access-qtll7\") pod \"machine-config-server-x4xrf\" (UID: \"37b4c0fc-6a82-4f6b-85fc-233090358f9c\") " pod="openshift-machine-config-operator/machine-config-server-x4xrf" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.505579 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8d5eff6-150c-4314-8ebc-38b3660ce01a-config\") pod \"kube-apiserver-operator-766d6c64bb-jvcqp\" (UID: \"e8d5eff6-150c-4314-8ebc-38b3660ce01a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jvcqp" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.506200 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-config-operator/openshift-config-operator-7777fb866f-xmkqm"] Mar 20 10:58:22 crc kubenswrapper[4860]: E0320 10:58:22.506539 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:23.006512743 +0000 UTC m=+227.227873641 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.513938 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3c8d3f73-b9b3-48cc-b6a0-f882d45bc9d3-images\") pod \"machine-config-operator-74547568cd-2glmz\" (UID: \"3c8d3f73-b9b3-48cc-b6a0-f882d45bc9d3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2glmz" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.514514 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f171fccd-40ef-44a6-941f-ef1f2f4d2c2a-config\") pod \"kube-controller-manager-operator-78b949d7b-v6ff7\" (UID: \"f171fccd-40ef-44a6-941f-ef1f2f4d2c2a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v6ff7" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.516170 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/3c8d3f73-b9b3-48cc-b6a0-f882d45bc9d3-auth-proxy-config\") pod \"machine-config-operator-74547568cd-2glmz\" (UID: \"3c8d3f73-b9b3-48cc-b6a0-f882d45bc9d3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2glmz" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.516201 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8d5eff6-150c-4314-8ebc-38b3660ce01a-config\") pod \"kube-apiserver-operator-766d6c64bb-jvcqp\" (UID: \"e8d5eff6-150c-4314-8ebc-38b3660ce01a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jvcqp" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.517928 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/403ca5f6-bd52-40de-88d6-5151b3202c76-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zhgh4\" (UID: \"403ca5f6-bd52-40de-88d6-5151b3202c76\") " pod="openshift-marketplace/marketplace-operator-79b997595-zhgh4" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.518101 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/39b41087-226b-4f73-9fc4-64616b430f2d-registry-certificates\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.518135 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/437c32d4-4b5f-4657-86d6-5214e3bfc01f-config-volume\") pod \"collect-profiles-29566725-d6wf9\" (UID: \"437c32d4-4b5f-4657-86d6-5214e3bfc01f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-d6wf9" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.518443 4860 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/34043403-110c-4547-81a4-7af1429878cd-tmpfs\") pod \"packageserver-d55dfcdfc-vfkd4\" (UID: \"34043403-110c-4547-81a4-7af1429878cd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vfkd4" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.519206 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/39b41087-226b-4f73-9fc4-64616b430f2d-trusted-ca\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.519462 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5043c51-bd3f-461f-b011-a42ad38ed7d4-config\") pod \"service-ca-operator-777779d784-x4x44\" (UID: \"f5043c51-bd3f-461f-b011-a42ad38ed7d4\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-x4x44" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.519522 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/39b41087-226b-4f73-9fc4-64616b430f2d-ca-trust-extracted\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.520908 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8d5eff6-150c-4314-8ebc-38b3660ce01a-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-jvcqp\" (UID: \"e8d5eff6-150c-4314-8ebc-38b3660ce01a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jvcqp" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 
10:58:22.520959 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f171fccd-40ef-44a6-941f-ef1f2f4d2c2a-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-v6ff7\" (UID: \"f171fccd-40ef-44a6-941f-ef1f2f4d2c2a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v6ff7" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.524793 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3c8d3f73-b9b3-48cc-b6a0-f882d45bc9d3-proxy-tls\") pod \"machine-config-operator-74547568cd-2glmz\" (UID: \"3c8d3f73-b9b3-48cc-b6a0-f882d45bc9d3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2glmz" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.524991 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/437c32d4-4b5f-4657-86d6-5214e3bfc01f-secret-volume\") pod \"collect-profiles-29566725-d6wf9\" (UID: \"437c32d4-4b5f-4657-86d6-5214e3bfc01f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-d6wf9" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.525171 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/34043403-110c-4547-81a4-7af1429878cd-apiservice-cert\") pod \"packageserver-d55dfcdfc-vfkd4\" (UID: \"34043403-110c-4547-81a4-7af1429878cd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vfkd4" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.525395 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9d98ac55-cf65-4f72-805b-dd3da2742004-profile-collector-cert\") pod \"olm-operator-6b444d44fb-9ffz6\" (UID: \"9d98ac55-cf65-4f72-805b-dd3da2742004\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9ffz6" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.525980 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5043c51-bd3f-461f-b011-a42ad38ed7d4-serving-cert\") pod \"service-ca-operator-777779d784-x4x44\" (UID: \"f5043c51-bd3f-461f-b011-a42ad38ed7d4\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-x4x44" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.526545 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/628f2025-d050-42a9-bf56-9daa0e5c001b-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-mcqdw\" (UID: \"628f2025-d050-42a9-bf56-9daa0e5c001b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mcqdw" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.527411 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-hfxcc" event={"ID":"a58a54d3-d454-4503-8b70-0e78784efdfc","Type":"ContainerStarted","Data":"d7f22518b520d9c10d2d73fa5144d12a4e95e49e74395d5b60a03b895d44f408"} Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.527466 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-hfxcc" event={"ID":"a58a54d3-d454-4503-8b70-0e78784efdfc","Type":"ContainerStarted","Data":"8b9a8cb26cb885997ef61249d758bafd09ddc862da19b4fd6d054c4c5141458c"} Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.527718 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/403ca5f6-bd52-40de-88d6-5151b3202c76-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zhgh4\" (UID: \"403ca5f6-bd52-40de-88d6-5151b3202c76\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-zhgh4" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.528120 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/8bc351c5-b724-443e-a7e2-f4abba352cef-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-jnqsw\" (UID: \"8bc351c5-b724-443e-a7e2-f4abba352cef\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jnqsw" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.528504 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9d98ac55-cf65-4f72-805b-dd3da2742004-srv-cert\") pod \"olm-operator-6b444d44fb-9ffz6\" (UID: \"9d98ac55-cf65-4f72-805b-dd3da2742004\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9ffz6" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.529002 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/34043403-110c-4547-81a4-7af1429878cd-webhook-cert\") pod \"packageserver-d55dfcdfc-vfkd4\" (UID: \"34043403-110c-4547-81a4-7af1429878cd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vfkd4" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.531078 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-5gdgj" event={"ID":"d582dc3e-7510-42be-aa3a-1d15b35c327c","Type":"ContainerStarted","Data":"955bb582cec92cd2ecd2c40b586780e0c9b8fdf6f046be17de2e9d5aa6119a42"} Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.531114 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-5gdgj" event={"ID":"d582dc3e-7510-42be-aa3a-1d15b35c327c","Type":"ContainerStarted","Data":"4007d87bec9e8cbbb2f96fba62cbd83fc23bd3a2fceb0d216c6cc21d5756432d"} 
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.531381 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/39b41087-226b-4f73-9fc4-64616b430f2d-installation-pull-secrets\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.532398 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lcdbx" event={"ID":"37972326-b1df-484f-ab10-9c595b145d8c","Type":"ContainerStarted","Data":"150a5762d679342e63f55b6c6648f93ca6c3f2fda48ad4b9322b94704a226cc2"} Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.533167 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hz7zj" event={"ID":"2f2e3391-68a2-43a8-aba9-17e583066b03","Type":"ContainerStarted","Data":"407ff7330deb9e9b1f7b10afba7128995bbec481b413ddbaec60802a65dd70f3"} Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.533934 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pr8t8" event={"ID":"24a452b3-94a8-4c29-8409-cb1a8dd11555","Type":"ContainerStarted","Data":"96f6b004bfae066c2c561b1c3a0e043026c1989e2006e7f31f8c168092822802"} Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.534078 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/39b41087-226b-4f73-9fc4-64616b430f2d-registry-tls\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.535872 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-76g8f" event={"ID":"ba450d5b-c962-4788-8215-d1eb12f9b314","Type":"ContainerStarted","Data":"82555483b3fc53c5b884a2ecae2f37dcfafce8a387c4121ff7f51f3a61df43ac"} Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.535892 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-76g8f" event={"ID":"ba450d5b-c962-4788-8215-d1eb12f9b314","Type":"ContainerStarted","Data":"04e53020d08b82eb6c1040a79b1ef2d1751e0e3d0b8b6ccf20f05e84232b8159"} Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.540495 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-twkfs" event={"ID":"0eff7ea5-251b-44de-b129-c604349d6e6c","Type":"ContainerStarted","Data":"d5905e478a83932c595496c691eb2aa9ded98c6cc8b295a165e6b198c1a83626"} Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.548868 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-7xnrh" event={"ID":"6ef8eec8-b86d-4f5a-931e-c76e11c07f94","Type":"ContainerStarted","Data":"80c24fbcd6aa7799e3ff2f38bb7fe7014267c8d3945752963f8b906a277e84a6"} Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.548908 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-7xnrh" event={"ID":"6ef8eec8-b86d-4f5a-931e-c76e11c07f94","Type":"ContainerStarted","Data":"2980d18f8ba28b033dbf5d36b160c7efad5ae1c6664cb1dba27569a3bcc37db2"} Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.549438 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-7xnrh" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.551322 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nxq82" 
event={"ID":"f15060fa-5a28-4a12-be7b-2823e921eb90","Type":"ContainerStarted","Data":"a1580a457002eb0c992197304d7aa1c99c6001d60b87a1e21dc5b0c8a7c76848"}
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.552108 4860 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-7xnrh container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body=
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.552150 4860 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-7xnrh" podUID="6ef8eec8-b86d-4f5a-931e-c76e11c07f94" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.552957 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5pq7k" event={"ID":"cfa87f04-40c6-4575-b647-fb13a115b81d","Type":"ContainerStarted","Data":"2a65c276eca2a8f4efd1f7a4c2c80d5c458f8825408efa4c709865822d2e34d3"}
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.553001 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5pq7k" event={"ID":"cfa87f04-40c6-4575-b647-fb13a115b81d","Type":"ContainerStarted","Data":"5b41337622fad4321f7bb86fe4803fd1cf32c952050c583a599de54f99215a27"}
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.556574 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hjpq\" (UniqueName: \"kubernetes.io/projected/ba2ab33e-6ecc-4eac-9aaa-256e6ff68236-kube-api-access-5hjpq\") pod \"auto-csr-approver-29566738-5cj22\" (UID: \"ba2ab33e-6ecc-4eac-9aaa-256e6ff68236\") " pod="openshift-infra/auto-csr-approver-29566738-5cj22"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.562980 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-9ktqw"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.565314 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gvm7\" (UniqueName: \"kubernetes.io/projected/628f2025-d050-42a9-bf56-9daa0e5c001b-kube-api-access-6gvm7\") pod \"package-server-manager-789f6589d5-mcqdw\" (UID: \"628f2025-d050-42a9-bf56-9daa0e5c001b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mcqdw"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.608146 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/37b4c0fc-6a82-4f6b-85fc-233090358f9c-certs\") pod \"machine-config-server-x4xrf\" (UID: \"37b4c0fc-6a82-4f6b-85fc-233090358f9c\") " pod="openshift-machine-config-operator/machine-config-server-x4xrf"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.608193 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383-socket-dir\") pod \"csi-hostpathplugin-2k58g\" (UID: \"dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383\") " pod="hostpath-provisioner/csi-hostpathplugin-2k58g"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.608248 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.608284 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1241cd05-23d3-4e5a-9130-29e7638003a9-cert\") pod \"ingress-canary-wt65f\" (UID: \"1241cd05-23d3-4e5a-9130-29e7638003a9\") " pod="openshift-ingress-canary/ingress-canary-wt65f"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.608322 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zh75p\" (UniqueName: \"kubernetes.io/projected/dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383-kube-api-access-zh75p\") pod \"csi-hostpathplugin-2k58g\" (UID: \"dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383\") " pod="hostpath-provisioner/csi-hostpathplugin-2k58g"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.608381 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtll7\" (UniqueName: \"kubernetes.io/projected/37b4c0fc-6a82-4f6b-85fc-233090358f9c-kube-api-access-qtll7\") pod \"machine-config-server-x4xrf\" (UID: \"37b4c0fc-6a82-4f6b-85fc-233090358f9c\") " pod="openshift-machine-config-operator/machine-config-server-x4xrf"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.608442 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/37b4c0fc-6a82-4f6b-85fc-233090358f9c-node-bootstrap-token\") pod \"machine-config-server-x4xrf\" (UID: \"37b4c0fc-6a82-4f6b-85fc-233090358f9c\") " pod="openshift-machine-config-operator/machine-config-server-x4xrf"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.608463 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmjsg\" (UniqueName: \"kubernetes.io/projected/1241cd05-23d3-4e5a-9130-29e7638003a9-kube-api-access-gmjsg\") pod \"ingress-canary-wt65f\" (UID: \"1241cd05-23d3-4e5a-9130-29e7638003a9\") " pod="openshift-ingress-canary/ingress-canary-wt65f"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.608482 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwl24\" (UniqueName: \"kubernetes.io/projected/10eeabbc-0cee-4be7-a0aa-c0fe0c570e0e-kube-api-access-zwl24\") pod \"dns-default-l2xf5\" (UID: \"10eeabbc-0cee-4be7-a0aa-c0fe0c570e0e\") " pod="openshift-dns/dns-default-l2xf5"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.608592 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383-plugins-dir\") pod \"csi-hostpathplugin-2k58g\" (UID: \"dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383\") " pod="hostpath-provisioner/csi-hostpathplugin-2k58g"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.608613 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/10eeabbc-0cee-4be7-a0aa-c0fe0c570e0e-config-volume\") pod \"dns-default-l2xf5\" (UID: \"10eeabbc-0cee-4be7-a0aa-c0fe0c570e0e\") " pod="openshift-dns/dns-default-l2xf5"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.608650 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383-mountpoint-dir\") pod \"csi-hostpathplugin-2k58g\" (UID: \"dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383\") " pod="hostpath-provisioner/csi-hostpathplugin-2k58g"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.608669 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383-csi-data-dir\") pod \"csi-hostpathplugin-2k58g\" (UID: \"dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383\") " pod="hostpath-provisioner/csi-hostpathplugin-2k58g"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.608705 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383-registration-dir\") pod \"csi-hostpathplugin-2k58g\" (UID: \"dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383\") " pod="hostpath-provisioner/csi-hostpathplugin-2k58g"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.608723 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/10eeabbc-0cee-4be7-a0aa-c0fe0c570e0e-metrics-tls\") pod \"dns-default-l2xf5\" (UID: \"10eeabbc-0cee-4be7-a0aa-c0fe0c570e0e\") " pod="openshift-dns/dns-default-l2xf5"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.609156 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqvlf\" (UniqueName: \"kubernetes.io/projected/403ca5f6-bd52-40de-88d6-5151b3202c76-kube-api-access-hqvlf\") pod \"marketplace-operator-79b997595-zhgh4\" (UID: \"403ca5f6-bd52-40de-88d6-5151b3202c76\") " pod="openshift-marketplace/marketplace-operator-79b997595-zhgh4"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.610388 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383-mountpoint-dir\") pod \"csi-hostpathplugin-2k58g\" (UID: \"dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383\") " pod="hostpath-provisioner/csi-hostpathplugin-2k58g"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.610549 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383-csi-data-dir\") pod \"csi-hostpathplugin-2k58g\" (UID: \"dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383\") " pod="hostpath-provisioner/csi-hostpathplugin-2k58g"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.611026 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/10eeabbc-0cee-4be7-a0aa-c0fe0c570e0e-config-volume\") pod \"dns-default-l2xf5\" (UID: \"10eeabbc-0cee-4be7-a0aa-c0fe0c570e0e\") " pod="openshift-dns/dns-default-l2xf5"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.611261 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383-registration-dir\") pod \"csi-hostpathplugin-2k58g\" (UID: \"dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383\") " pod="hostpath-provisioner/csi-hostpathplugin-2k58g"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.611624 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383-plugins-dir\") pod \"csi-hostpathplugin-2k58g\" (UID: \"dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383\") " pod="hostpath-provisioner/csi-hostpathplugin-2k58g"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.615726 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383-socket-dir\") pod \"csi-hostpathplugin-2k58g\" (UID: \"dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383\") " pod="hostpath-provisioner/csi-hostpathplugin-2k58g"
Mar 20 10:58:22 crc kubenswrapper[4860]: E0320 10:58:22.616483 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:23.116459942 +0000 UTC m=+227.337820850 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.618391 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/10eeabbc-0cee-4be7-a0aa-c0fe0c570e0e-metrics-tls\") pod \"dns-default-l2xf5\" (UID: \"10eeabbc-0cee-4be7-a0aa-c0fe0c570e0e\") " pod="openshift-dns/dns-default-l2xf5"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.619246 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/37b4c0fc-6a82-4f6b-85fc-233090358f9c-node-bootstrap-token\") pod \"machine-config-server-x4xrf\" (UID: \"37b4c0fc-6a82-4f6b-85fc-233090358f9c\") " pod="openshift-machine-config-operator/machine-config-server-x4xrf"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.620704 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/37b4c0fc-6a82-4f6b-85fc-233090358f9c-certs\") pod \"machine-config-server-x4xrf\" (UID: \"37b4c0fc-6a82-4f6b-85fc-233090358f9c\") " pod="openshift-machine-config-operator/machine-config-server-x4xrf"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.626938 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1241cd05-23d3-4e5a-9130-29e7638003a9-cert\") pod \"ingress-canary-wt65f\" (UID: \"1241cd05-23d3-4e5a-9130-29e7638003a9\") " pod="openshift-ingress-canary/ingress-canary-wt65f"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.632217 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7mtp\" (UniqueName: \"kubernetes.io/projected/9d98ac55-cf65-4f72-805b-dd3da2742004-kube-api-access-p7mtp\") pod \"olm-operator-6b444d44fb-9ffz6\" (UID: \"9d98ac55-cf65-4f72-805b-dd3da2742004\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9ffz6"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.663159 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnfgq\" (UniqueName: \"kubernetes.io/projected/3c8d3f73-b9b3-48cc-b6a0-f882d45bc9d3-kube-api-access-xnfgq\") pod \"machine-config-operator-74547568cd-2glmz\" (UID: \"3c8d3f73-b9b3-48cc-b6a0-f882d45bc9d3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2glmz"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.673701 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n92tj\" (UniqueName: \"kubernetes.io/projected/39b41087-226b-4f73-9fc4-64616b430f2d-kube-api-access-n92tj\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.693309 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e8d5eff6-150c-4314-8ebc-38b3660ce01a-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-jvcqp\" (UID: \"e8d5eff6-150c-4314-8ebc-38b3660ce01a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jvcqp"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.705756 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4429p"]
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.710162 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 10:58:22 crc kubenswrapper[4860]: E0320 10:58:22.710401 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:23.210362344 +0000 UTC m=+227.431723242 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.711377 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm"
Mar 20 10:58:22 crc kubenswrapper[4860]: E0320 10:58:22.712260 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:23.212215945 +0000 UTC m=+227.433576833 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.714042 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-248wx\" (UniqueName: \"kubernetes.io/projected/34043403-110c-4547-81a4-7af1429878cd-kube-api-access-248wx\") pod \"packageserver-d55dfcdfc-vfkd4\" (UID: \"34043403-110c-4547-81a4-7af1429878cd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vfkd4"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.718747 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2glmz"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.737163 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpk7d\" (UniqueName: \"kubernetes.io/projected/437c32d4-4b5f-4657-86d6-5214e3bfc01f-kube-api-access-wpk7d\") pod \"collect-profiles-29566725-d6wf9\" (UID: \"437c32d4-4b5f-4657-86d6-5214e3bfc01f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-d6wf9"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.744776 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vfkd4"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.756106 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mcqdw"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.756538 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/39b41087-226b-4f73-9fc4-64616b430f2d-bound-sa-token\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.769023 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fplbn\" (UniqueName: \"kubernetes.io/projected/f5043c51-bd3f-461f-b011-a42ad38ed7d4-kube-api-access-fplbn\") pod \"service-ca-operator-777779d784-x4x44\" (UID: \"f5043c51-bd3f-461f-b011-a42ad38ed7d4\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-x4x44"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.774108 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-d6wf9"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.793232 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.805304 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566738-5cj22"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.809760 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-6xl8q"]
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.810203 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f171fccd-40ef-44a6-941f-ef1f2f4d2c2a-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-v6ff7\" (UID: \"f171fccd-40ef-44a6-941f-ef1f2f4d2c2a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v6ff7"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.813081 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9q9t\" (UniqueName: \"kubernetes.io/projected/8bc351c5-b724-443e-a7e2-f4abba352cef-kube-api-access-p9q9t\") pod \"control-plane-machine-set-operator-78cbb6b69f-jnqsw\" (UID: \"8bc351c5-b724-443e-a7e2-f4abba352cef\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jnqsw"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.813601 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9ffz6"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.814358 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 10:58:22 crc kubenswrapper[4860]: E0320 10:58:22.814957 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:23.314937573 +0000 UTC m=+227.536298471 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.820282 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zhgh4"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.875164 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmjsg\" (UniqueName: \"kubernetes.io/projected/1241cd05-23d3-4e5a-9130-29e7638003a9-kube-api-access-gmjsg\") pod \"ingress-canary-wt65f\" (UID: \"1241cd05-23d3-4e5a-9130-29e7638003a9\") " pod="openshift-ingress-canary/ingress-canary-wt65f"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.889531 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwl24\" (UniqueName: \"kubernetes.io/projected/10eeabbc-0cee-4be7-a0aa-c0fe0c570e0e-kube-api-access-zwl24\") pod \"dns-default-l2xf5\" (UID: \"10eeabbc-0cee-4be7-a0aa-c0fe0c570e0e\") " pod="openshift-dns/dns-default-l2xf5"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.899386 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zh75p\" (UniqueName: \"kubernetes.io/projected/dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383-kube-api-access-zh75p\") pod \"csi-hostpathplugin-2k58g\" (UID: \"dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383\") " pod="hostpath-provisioner/csi-hostpathplugin-2k58g"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.916557 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm"
Mar 20 10:58:22 crc kubenswrapper[4860]: E0320 10:58:22.918660 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:23.418635937 +0000 UTC m=+227.639996835 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.930458 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtll7\" (UniqueName: \"kubernetes.io/projected/37b4c0fc-6a82-4f6b-85fc-233090358f9c-kube-api-access-qtll7\") pod \"machine-config-server-x4xrf\" (UID: \"37b4c0fc-6a82-4f6b-85fc-233090358f9c\") " pod="openshift-machine-config-operator/machine-config-server-x4xrf"
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.937652 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-pc5tf"]
Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.018901 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 10:58:23 crc kubenswrapper[4860]: E0320 10:58:23.019418 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:23.51939831 +0000 UTC m=+227.740759218 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.019606 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v6ff7"
Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.020197 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jvcqp"
Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.020977 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-5gdgj"
Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.021250 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jnqsw"
Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.023673 4860 patch_prober.go:28] interesting pod/router-default-5444994796-5gdgj container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body=
Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.023726 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5gdgj" podUID="d582dc3e-7510-42be-aa3a-1d15b35c327c" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused"
Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.068133 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-x4x44"
Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.124888 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm"
Mar 20 10:58:23 crc kubenswrapper[4860]: E0320 10:58:23.125778 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:23.625764039 +0000 UTC m=+227.847124927 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.136345 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-2k58g"
Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.167276 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-wt65f"
Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.185420 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-l2xf5"
Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.200801 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-x4xrf"
Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.233383 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 10:58:23 crc kubenswrapper[4860]: E0320 10:58:23.233933 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:23.733910807 +0000 UTC m=+227.955271705 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.335210 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm"
Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.335829 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-s52jd"]
Mar 20 10:58:23 crc kubenswrapper[4860]: E0320 10:58:23.335962 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:23.835935155 +0000 UTC m=+228.057296053 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.436345 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 10:58:23 crc kubenswrapper[4860]: E0320 10:58:23.436835 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:23.936814311 +0000 UTC m=+228.158175209 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.545370 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm"
Mar 20 10:58:23 crc kubenswrapper[4860]: E0320 10:58:23.546855 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:24.046827861 +0000 UTC m=+228.268188759 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.574522 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-76g8f" podStartSLOduration=155.574491471 podStartE2EDuration="2m35.574491471s" podCreationTimestamp="2026-03-20 10:55:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:23.573183864 +0000 UTC m=+227.794544762" watchObservedRunningTime="2026-03-20 10:58:23.574491471 +0000 UTC m=+227.795852369"
Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.585691 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hz7zj" event={"ID":"2f2e3391-68a2-43a8-aba9-17e583066b03","Type":"ContainerStarted","Data":"d15e97a481a2866411fa098f353115fd974d34dab62e8e9911124e820ac31078"}
Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.598148 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nxq82" event={"ID":"f15060fa-5a28-4a12-be7b-2823e921eb90","Type":"ContainerStarted","Data":"772e9b2c3443f142ebe0053c3effb51efaae04c134aa23c92ff36573b43b35df"}
Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.598491 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nxq82"
Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.601962 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" event={"ID":"3587f3ba-577b-425a-adf5-336a8977dcc5","Type":"ContainerStarted","Data":"366c71d2561bff010f4d5dff91d7764636b34e8d53c1f0235c50a2b7eb65710b"}
Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.616824 4860 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-nxq82 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body=
Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.616909 4860 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nxq82" podUID="f15060fa-5a28-4a12-be7b-2823e921eb90" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused"
Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.624597 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lcdbx" event={"ID":"37972326-b1df-484f-ab10-9c595b145d8c","Type":"ContainerStarted","Data":"8e3bcc81cdd00a709affe27df968c6c77cd3b334be5615abb138b11884ded3f7"}
Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.626611 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-pc5tf" event={"ID":"7ee7ff56-a3fb-465a-9ab5-8cba6bbfcd0e","Type":"ContainerStarted","Data":"e69053991352de55130197ac65f86ee443a634f8c6f03ab8a7b0e76ccb89ebff"}
Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.631502 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pr8t8"
event={"ID":"24a452b3-94a8-4c29-8409-cb1a8dd11555","Type":"ContainerStarted","Data":"7f86054b01bd27d3967146848a350f1aec6609eb38c33d089881ea7c7eef77f3"} Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.632645 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-x4xrf" event={"ID":"37b4c0fc-6a82-4f6b-85fc-233090358f9c","Type":"ContainerStarted","Data":"3bfc86a783d03b8665695deb2c7a8ebbef5d8f64d921b3278190db7abb1fcc2b"} Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.637336 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xmkqm" event={"ID":"f67156a9-f474-4d80-9789-ffbfcc9ec78b","Type":"ContainerStarted","Data":"b5749d497a851df1c8d25600c4f06807dfeb144cf1330047dd7385abcb40ccc2"} Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.637399 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xmkqm" event={"ID":"f67156a9-f474-4d80-9789-ffbfcc9ec78b","Type":"ContainerStarted","Data":"ea92a36da340039fb1ce36999b468209253b73004a017fa10428756510dbd011"} Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.646788 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-6xl8q" event={"ID":"c9dab77c-3c60-4c91-8c0a-31791124462d","Type":"ContainerStarted","Data":"728f7b7c66988558b1d883323799eddbefdffa3dccc36021b7936ebb56e4353c"} Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.647067 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:23 crc kubenswrapper[4860]: E0320 10:58:23.647976 4860 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:24.147955754 +0000 UTC m=+228.369316642 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.659403 4860 generic.go:334] "Generic (PLEG): container finished" podID="0eff7ea5-251b-44de-b129-c604349d6e6c" containerID="bf29f5e88646849ccbf1ef41a523006e5d0e1517aafc4bfd201c3385a8c598bb" exitCode=0 Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.659548 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-twkfs" event={"ID":"0eff7ea5-251b-44de-b129-c604349d6e6c","Type":"ContainerDied","Data":"bf29f5e88646849ccbf1ef41a523006e5d0e1517aafc4bfd201c3385a8c598bb"} Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.670548 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4429p" event={"ID":"d62cba0d-d390-4638-aa42-59631e4bf118","Type":"ContainerStarted","Data":"44721ad3dbc20a124e5ccbe9787803f6ec0cc51a6aecb26e2a19230ed22c5b87"} Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.670610 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4429p" 
event={"ID":"d62cba0d-d390-4638-aa42-59631e4bf118","Type":"ContainerStarted","Data":"c4c3728cb0ff75517d60e0661b517c4cc605807cf199fa6e242c96682e23cf56"} Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.671371 4860 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-7xnrh container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.671467 4860 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-7xnrh" podUID="6ef8eec8-b86d-4f5a-931e-c76e11c07f94" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.737289 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-hfxcc" podStartSLOduration=155.737256468 podStartE2EDuration="2m35.737256468s" podCreationTimestamp="2026-03-20 10:55:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:23.724917975 +0000 UTC m=+227.946278863" watchObservedRunningTime="2026-03-20 10:58:23.737256468 +0000 UTC m=+227.958617366" Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.749994 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:23 crc kubenswrapper[4860]: E0320 
10:58:23.753300 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:24.253284564 +0000 UTC m=+228.474645472 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.860989 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:23 crc kubenswrapper[4860]: E0320 10:58:23.861483 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:24.361449463 +0000 UTC m=+228.582810361 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.868291 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:23 crc kubenswrapper[4860]: E0320 10:58:23.868753 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:24.368734836 +0000 UTC m=+228.590095734 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.978152 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:23 crc kubenswrapper[4860]: E0320 10:58:23.978510 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:24.478453067 +0000 UTC m=+228.699813965 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.978617 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:23 crc kubenswrapper[4860]: E0320 10:58:23.979425 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:24.479414014 +0000 UTC m=+228.700774912 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.086060 4860 patch_prober.go:28] interesting pod/router-default-5444994796-5gdgj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 10:58:24 crc kubenswrapper[4860]: [-]has-synced failed: reason withheld Mar 20 10:58:24 crc kubenswrapper[4860]: [+]process-running ok Mar 20 10:58:24 crc kubenswrapper[4860]: healthz check failed Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.086817 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5gdgj" podUID="d582dc3e-7510-42be-aa3a-1d15b35c327c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.088063 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:24 crc kubenswrapper[4860]: E0320 10:58:24.088566 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 10:58:24.588530809 +0000 UTC m=+228.809891707 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.158727 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-5gdgj" podStartSLOduration=156.158696111 podStartE2EDuration="2m36.158696111s" podCreationTimestamp="2026-03-20 10:55:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:24.158567948 +0000 UTC m=+228.379928866" watchObservedRunningTime="2026-03-20 10:58:24.158696111 +0000 UTC m=+228.380057009" Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.189995 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:24 crc kubenswrapper[4860]: E0320 10:58:24.190449 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:24.690429654 +0000 UTC m=+228.911790552 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.297544 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:24 crc kubenswrapper[4860]: E0320 10:58:24.297956 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:24.797937504 +0000 UTC m=+229.019298402 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.399487 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:24 crc kubenswrapper[4860]: E0320 10:58:24.399961 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:24.899940392 +0000 UTC m=+229.121301290 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.428442 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-sqrz5"] Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.464292 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-7xnrh" podStartSLOduration=156.464271641 podStartE2EDuration="2m36.464271641s" podCreationTimestamp="2026-03-20 10:55:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:24.463008396 +0000 UTC m=+228.684369294" watchObservedRunningTime="2026-03-20 10:58:24.464271641 +0000 UTC m=+228.685632539" Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.501083 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:24 crc kubenswrapper[4860]: E0320 10:58:24.501736 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 10:58:25.001694122 +0000 UTC m=+229.223055030 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.510853 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wv5fd"] Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.527406 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-jffj8"] Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.529853 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5pq7k" podStartSLOduration=156.529817374 podStartE2EDuration="2m36.529817374s" podCreationTimestamp="2026-03-20 10:55:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:24.501504187 +0000 UTC m=+228.722865085" watchObservedRunningTime="2026-03-20 10:58:24.529817374 +0000 UTC m=+228.751178282" Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.537804 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-vpp2k"] Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.541367 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-vk2rn"] Mar 20 10:58:24 crc kubenswrapper[4860]: 
I0320 10:58:24.584439 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qgkd4"] Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.603491 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:24 crc kubenswrapper[4860]: E0320 10:58:24.603963 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:25.103947286 +0000 UTC m=+229.325308184 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.671403 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nxq82" podStartSLOduration=155.671380642 podStartE2EDuration="2m35.671380642s" podCreationTimestamp="2026-03-20 10:55:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:24.628349485 +0000 UTC m=+228.849710403" watchObservedRunningTime="2026-03-20 
10:58:24.671380642 +0000 UTC m=+228.892741540" Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.673634 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7xnrh"] Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.715157 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:24 crc kubenswrapper[4860]: E0320 10:58:24.715843 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:25.215825709 +0000 UTC m=+229.437186607 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.716926 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4429p" podStartSLOduration=156.716902478 podStartE2EDuration="2m36.716902478s" podCreationTimestamp="2026-03-20 10:55:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:24.716169468 +0000 UTC m=+228.937530366" watchObservedRunningTime="2026-03-20 10:58:24.716902478 +0000 UTC m=+228.938263366"
Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.744742 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-pc5tf" event={"ID":"7ee7ff56-a3fb-465a-9ab5-8cba6bbfcd0e","Type":"ContainerStarted","Data":"bf6e780cda4d0e96b97c12d1f930b8d9a9052663180824d5d07b084774f4f428"}
Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.746010 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-pc5tf"
Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.747328 4860 patch_prober.go:28] interesting pod/console-operator-58897d9998-pc5tf container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.19:8443/readyz\": dial tcp 10.217.0.19:8443: connect: connection refused" start-of-body=
Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.747384 4860 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-pc5tf" podUID="7ee7ff56-a3fb-465a-9ab5-8cba6bbfcd0e" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.19:8443/readyz\": dial tcp 10.217.0.19:8443: connect: connection refused"
Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.751755 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6stjq"]
Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.789134 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-nxq82"]
Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.791137 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hz7zj" event={"ID":"2f2e3391-68a2-43a8-aba9-17e583066b03","Type":"ContainerStarted","Data":"d6080f61cdb25b20af439b102bbb43cd441921ae90ae93f1c70d77b6ed385819"}
Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.814074 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pr8t8" event={"ID":"24a452b3-94a8-4c29-8409-cb1a8dd11555","Type":"ContainerStarted","Data":"993427f66306f40b17690f6f8ef81c6fa79154ebe2368ca11c2a070064244bc5"}
Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.817541 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm"
Mar 20 10:58:24 crc kubenswrapper[4860]: E0320 10:58:24.819388 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:25.319368619 +0000 UTC m=+229.540729517 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.850626 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-x4xrf" event={"ID":"37b4c0fc-6a82-4f6b-85fc-233090358f9c","Type":"ContainerStarted","Data":"8351d681bf6b9870f3f400882880d24a4cf069dce42ced86209a6c13b4ce520f"}
Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.872536 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" event={"ID":"3587f3ba-577b-425a-adf5-336a8977dcc5","Type":"ContainerStarted","Data":"7c8b8128f311582fc677a42230767316cb90b5bf061457d3564d8b09eec2a186"}
Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.873748 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-srz5x"
Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.899428 4860 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-srz5x container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.8:6443/healthz\": dial tcp 10.217.0.8:6443: connect: connection refused" start-of-body=
Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.899522 4860 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" podUID="3587f3ba-577b-425a-adf5-336a8977dcc5" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.8:6443/healthz\": dial tcp 10.217.0.8:6443: connect: connection refused"
Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.908603 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lcdbx" event={"ID":"37972326-b1df-484f-ab10-9c595b145d8c","Type":"ContainerStarted","Data":"420a5dc4ad075a9c34e1a84426f4130ed1e36ffc575249e97d527a7c520af8f2"}
Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.911749 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-rtgqn"]
Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.918879 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 10:58:24 crc kubenswrapper[4860]: E0320 10:58:24.920847 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:25.420825171 +0000 UTC m=+229.642186069 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.943353 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vfkd4"]
Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.946876 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-2glmz"]
Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.948028 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v6ff7"]
Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.948965 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-sqrz5" event={"ID":"e8ca532e-b0d7-494c-886f-bff0c8009707","Type":"ContainerStarted","Data":"72e1e1c0612e639b5d9b1dd93371fee28768245c503b21f6343128336d8f4145"}
Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.966942 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-pc5tf" podStartSLOduration=156.966919973 podStartE2EDuration="2m36.966919973s" podCreationTimestamp="2026-03-20 10:55:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:24.946899826 +0000 UTC m=+229.168260724" watchObservedRunningTime="2026-03-20 10:58:24.966919973 +0000 UTC m=+229.188280871"
Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.967191 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-l2xf5"]
Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.985192 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-s52jd" event={"ID":"d4ce1856-395a-4003-9642-61da7cbdd789","Type":"ContainerStarted","Data":"645536bd2ef9c23dcee61a179ce48f4b51cec83f50bb54a946b11237357fa0e2"}
Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.985367 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-s52jd" event={"ID":"d4ce1856-395a-4003-9642-61da7cbdd789","Type":"ContainerStarted","Data":"c2e08f4783ce6285a82d3ffbfb2feb47cf7a513b1f9869344d1f6adeade4ec65"}
Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.001317 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566738-5cj22"]
Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.011382 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gf5nr"]
Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.013212 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zhgh4"]
Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.014079 4860 generic.go:334] "Generic (PLEG): container finished" podID="f67156a9-f474-4d80-9789-ffbfcc9ec78b" containerID="b5749d497a851df1c8d25600c4f06807dfeb144cf1330047dd7385abcb40ccc2" exitCode=0
Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.016578 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xmkqm" event={"ID":"f67156a9-f474-4d80-9789-ffbfcc9ec78b","Type":"ContainerDied","Data":"b5749d497a851df1c8d25600c4f06807dfeb144cf1330047dd7385abcb40ccc2"}
Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.043810 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xmkqm" event={"ID":"f67156a9-f474-4d80-9789-ffbfcc9ec78b","Type":"ContainerStarted","Data":"a7ed0d3f65c76368cac45c0ff7b006b217e10b81eb81c7216943753d9981573c"}
Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.043853 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xmkqm"
Mar 20 10:58:25 crc kubenswrapper[4860]: E0320 10:58:25.025390 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:25.525359329 +0000 UTC m=+229.746720227 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.024932 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm"
Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.036076 4860 patch_prober.go:28] interesting pod/router-default-5444994796-5gdgj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 10:58:25 crc kubenswrapper[4860]: [-]has-synced failed: reason withheld
Mar 20 10:58:25 crc kubenswrapper[4860]: [+]process-running ok
Mar 20 10:58:25 crc kubenswrapper[4860]: healthz check failed
Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.044120 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5gdgj" podUID="d582dc3e-7510-42be-aa3a-1d15b35c327c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.058086 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-9ktqw"]
Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.061097 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mcqdw"]
Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.063062 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-6xl8q" event={"ID":"c9dab77c-3c60-4c91-8c0a-31791124462d","Type":"ContainerStarted","Data":"6212a5d7099084ae132439cdd3e54e16ca1b64ce00eb7e8d3db188b218442702"}
Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.063843 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-7xnrh" podUID="6ef8eec8-b86d-4f5a-931e-c76e11c07f94" containerName="controller-manager" containerID="cri-o://80c24fbcd6aa7799e3ff2f38bb7fe7014267c8d3945752963f8b906a277e84a6" gracePeriod=30
Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.068117 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nxq82"
Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.069062 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-2k58g"]
Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.094202 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hz7zj" podStartSLOduration=157.094154472 podStartE2EDuration="2m37.094154472s" podCreationTimestamp="2026-03-20 10:55:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:25.049876301 +0000 UTC m=+229.271237199" watchObservedRunningTime="2026-03-20 10:58:25.094154472 +0000 UTC m=+229.315515370"
Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.113090 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lcdbx" podStartSLOduration=156.113061378 podStartE2EDuration="2m36.113061378s" podCreationTimestamp="2026-03-20 10:55:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:25.098647147 +0000 UTC m=+229.320008065" watchObservedRunningTime="2026-03-20 10:58:25.113061378 +0000 UTC m=+229.334422276"
Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.115937 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-45vfv"]
Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.132453 4860 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-7xnrh container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": read tcp 10.217.0.2:42172->10.217.0.7:8443: read: connection reset by peer" start-of-body=
Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.132560 4860 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-7xnrh" podUID="6ef8eec8-b86d-4f5a-931e-c76e11c07f94" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": read tcp 10.217.0.2:42172->10.217.0.7:8443: read: connection reset by peer"
Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.140664 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pr8t8" podStartSLOduration=156.140579864 podStartE2EDuration="2m36.140579864s" podCreationTimestamp="2026-03-20 10:55:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:25.129479955 +0000 UTC m=+229.350840873" watchObservedRunningTime="2026-03-20 10:58:25.140579864 +0000 UTC m=+229.361940792"
Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.152477 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 10:58:25 crc kubenswrapper[4860]: E0320 10:58:25.152834 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:25.652644609 +0000 UTC m=+229.874005507 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.153056 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm"
Mar 20 10:58:25 crc kubenswrapper[4860]: E0320 10:58:25.159481 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:25.659453629 +0000 UTC m=+229.880814537 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.207722 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jnqsw"]
Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.227548 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9ffz6"]
Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.246931 4860 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.252769 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jvcqp"]
Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.262390 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 10:58:25 crc kubenswrapper[4860]: E0320 10:58:25.263539 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:25.763510713 +0000 UTC m=+229.984871621 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.287917 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566725-d6wf9"]
Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.288748 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-x4x44"]
Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.289736 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" podStartSLOduration=157.289720092 podStartE2EDuration="2m37.289720092s" podCreationTimestamp="2026-03-20 10:55:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:25.207593658 +0000 UTC m=+229.428954576" watchObservedRunningTime="2026-03-20 10:58:25.289720092 +0000 UTC m=+229.511080990"
Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.293986 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-x4xrf" podStartSLOduration=6.293978841 podStartE2EDuration="6.293978841s" podCreationTimestamp="2026-03-20 10:58:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:25.247130157 +0000 UTC m=+229.468491055" watchObservedRunningTime="2026-03-20 10:58:25.293978841 +0000 UTC m=+229.515339739"
Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.295281 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xmkqm" podStartSLOduration=157.295276277 podStartE2EDuration="2m37.295276277s" podCreationTimestamp="2026-03-20 10:55:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:25.287632554 +0000 UTC m=+229.508993452" watchObservedRunningTime="2026-03-20 10:58:25.295276277 +0000 UTC m=+229.516637175"
Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.296010 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-wt65f"]
Mar 20 10:58:25 crc kubenswrapper[4860]: W0320 10:58:25.330568 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod403ca5f6_bd52_40de_88d6_5151b3202c76.slice/crio-2383508dcb35c927549b146a858bc10ffc0e92b071146282263459310c1e4a93 WatchSource:0}: Error finding container 2383508dcb35c927549b146a858bc10ffc0e92b071146282263459310c1e4a93: Status 404 returned error can't find the container with id 2383508dcb35c927549b146a858bc10ffc0e92b071146282263459310c1e4a93
Mar 20 10:58:25 crc kubenswrapper[4860]: W0320 10:58:25.337714 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod825c6b77_c03a_463c_b9a4_d26a1ac398f0.slice/crio-c602627305bcc67508e306d123ef0f2def758ac6be365554271d4574f4292d93 WatchSource:0}: Error finding container c602627305bcc67508e306d123ef0f2def758ac6be365554271d4574f4292d93: Status 404 returned error can't find the container with id c602627305bcc67508e306d123ef0f2def758ac6be365554271d4574f4292d93
Mar 20 10:58:25 crc kubenswrapper[4860]: W0320 10:58:25.338754 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod628f2025_d050_42a9_bf56_9daa0e5c001b.slice/crio-b0677e3f5760dedf3c22df48e7121a44fa069a668de16580ffd7f952129c4950 WatchSource:0}: Error finding container b0677e3f5760dedf3c22df48e7121a44fa069a668de16580ffd7f952129c4950: Status 404 returned error can't find the container with id b0677e3f5760dedf3c22df48e7121a44fa069a668de16580ffd7f952129c4950
Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.366173 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm"
Mar 20 10:58:25 crc kubenswrapper[4860]: E0320 10:58:25.366698 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:25.866678483 +0000 UTC m=+230.088039391 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:58:25 crc kubenswrapper[4860]: W0320 10:58:25.417676 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d98ac55_cf65_4f72_805b_dd3da2742004.slice/crio-311fba8b08b109183afdc96b1c3152adff88a2bff256e16014ab0e9b6c0499f4 WatchSource:0}: Error finding container 311fba8b08b109183afdc96b1c3152adff88a2bff256e16014ab0e9b6c0499f4: Status 404 returned error can't find the container with id 311fba8b08b109183afdc96b1c3152adff88a2bff256e16014ab0e9b6c0499f4
Mar 20 10:58:25 crc kubenswrapper[4860]: W0320 10:58:25.418599 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8d5eff6_150c_4314_8ebc_38b3660ce01a.slice/crio-3b2687d51d2b5da138dd6c49db1115dcb5b1d49ced219c3bfc2da338f96e7921 WatchSource:0}: Error finding container 3b2687d51d2b5da138dd6c49db1115dcb5b1d49ced219c3bfc2da338f96e7921: Status 404 returned error can't find the container with id 3b2687d51d2b5da138dd6c49db1115dcb5b1d49ced219c3bfc2da338f96e7921
Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.467109 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 10:58:25 crc kubenswrapper[4860]: E0320 10:58:25.467478 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:25.967461176 +0000 UTC m=+230.188822074 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.570774 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm"
Mar 20 10:58:25 crc kubenswrapper[4860]: E0320 10:58:25.571437 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:26.071423658 +0000 UTC m=+230.292784556 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.672140 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 10:58:25 crc kubenswrapper[4860]: E0320 10:58:25.672609 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:26.172591502 +0000 UTC m=+230.393952400 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.773743 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm"
Mar 20 10:58:25 crc kubenswrapper[4860]: E0320 10:58:25.774193 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:26.274176718 +0000 UTC m=+230.495537616 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.877383 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 10:58:25 crc kubenswrapper[4860]: E0320 10:58:25.877570 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:26.377536653 +0000 UTC m=+230.598897561 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.882777 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm"
Mar 20 10:58:25 crc kubenswrapper[4860]: E0320 10:58:25.883508 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:26.383458488 +0000 UTC m=+230.604819566 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.984277 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 10:58:25 crc kubenswrapper[4860]: E0320 10:58:25.984454 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:26.484411986 +0000 UTC m=+230.705772884 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.985003 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm"
Mar 20 10:58:25 crc kubenswrapper[4860]: E0320 10:58:25.985378 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:26.485363282 +0000 UTC m=+230.706724180 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.027042 4860 patch_prober.go:28] interesting pod/router-default-5444994796-5gdgj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 10:58:26 crc kubenswrapper[4860]: [-]has-synced failed: reason withheld Mar 20 10:58:26 crc kubenswrapper[4860]: [+]process-running ok Mar 20 10:58:26 crc kubenswrapper[4860]: healthz check failed Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.027113 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5gdgj" podUID="d582dc3e-7510-42be-aa3a-1d15b35c327c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.086391 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:26 crc kubenswrapper[4860]: E0320 10:58:26.087021 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 10:58:26.586995479 +0000 UTC m=+230.808356387 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.102893 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-d6wf9" event={"ID":"437c32d4-4b5f-4657-86d6-5214e3bfc01f","Type":"ContainerStarted","Data":"07d327bc1bb178b575c3169b4eaad76591b2a789fd3236207f1f2278827c3306"} Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.106008 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-wt65f" event={"ID":"1241cd05-23d3-4e5a-9130-29e7638003a9","Type":"ContainerStarted","Data":"469c0252f0fca726d532459416225c858cce28407a228aecb8dbe49aaa2ec784"} Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.110019 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jnqsw" event={"ID":"8bc351c5-b724-443e-a7e2-f4abba352cef","Type":"ContainerStarted","Data":"5ad7e44b418ec559bc4d49c59ea906e7ba235765030137c43979015514a1ad12"} Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.114873 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-jffj8" event={"ID":"8b0b480d-ae68-4b26-b9f8-6b3caef70971","Type":"ContainerStarted","Data":"414fbe8b6cfca314e3587fdb3aad4f95463437a4d6cb42d157c9e579cbdf1913"} Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.114950 4860 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-jffj8" event={"ID":"8b0b480d-ae68-4b26-b9f8-6b3caef70971","Type":"ContainerStarted","Data":"b20f594348279afb8642d5a7ff43240cbd15f7eb43febdb9945c4732413e3e3f"} Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.135285 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-s52jd" event={"ID":"d4ce1856-395a-4003-9642-61da7cbdd789","Type":"ContainerStarted","Data":"6428b0277a7a5be9ac441d1fc30231736c58ad72e7183e9a00e9f736ef50f66b"} Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.144273 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-jffj8" podStartSLOduration=158.144242491 podStartE2EDuration="2m38.144242491s" podCreationTimestamp="2026-03-20 10:55:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:26.142805571 +0000 UTC m=+230.364166489" watchObservedRunningTime="2026-03-20 10:58:26.144242491 +0000 UTC m=+230.365603389" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.162706 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v6ff7" event={"ID":"f171fccd-40ef-44a6-941f-ef1f2f4d2c2a","Type":"ContainerStarted","Data":"cf9fafd2774029771c6ee72f7fe27ae302a8ed4ae12276c15a8e533c4a4c9ef5"} Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.169932 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-sqrz5" event={"ID":"e8ca532e-b0d7-494c-886f-bff0c8009707","Type":"ContainerStarted","Data":"589d18b62b1f0e4971b0fb3e2fc61341927ed65c1c47ad52b5aa882f952f550d"} Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.183208 4860 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-s52jd" podStartSLOduration=157.183181594 podStartE2EDuration="2m37.183181594s" podCreationTimestamp="2026-03-20 10:55:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:26.177857746 +0000 UTC m=+230.399218644" watchObservedRunningTime="2026-03-20 10:58:26.183181594 +0000 UTC m=+230.404542492" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.189316 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.190208 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jvcqp" event={"ID":"e8d5eff6-150c-4314-8ebc-38b3660ce01a","Type":"ContainerStarted","Data":"3b2687d51d2b5da138dd6c49db1115dcb5b1d49ced219c3bfc2da338f96e7921"} Mar 20 10:58:26 crc kubenswrapper[4860]: E0320 10:58:26.191279 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:26.691256259 +0000 UTC m=+230.912617347 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.201798 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-vk2rn" event={"ID":"b7c6fefc-e60e-423d-ad15-2e16173ae01b","Type":"ContainerStarted","Data":"403212a771580bd8fc898ce5ac315f5a5cd01c0d5a1df90ecb9c00b992c1d0b0"} Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.201856 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-vk2rn" event={"ID":"b7c6fefc-e60e-423d-ad15-2e16173ae01b","Type":"ContainerStarted","Data":"32a5532bfab4a262174bf80ed87400cfb109e79cecd3a1c5897723d7a5608d4b"} Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.208321 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-9ktqw" event={"ID":"b2a9fedf-d226-4388-8432-b22efd3b74bb","Type":"ContainerStarted","Data":"a26cc8b3d618a3b1bf4af57d50283a08c621d22a73263150bbca7713001da4fd"} Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.239202 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vfkd4" event={"ID":"34043403-110c-4547-81a4-7af1429878cd","Type":"ContainerStarted","Data":"d84af7f21702b51abdfa0256281c3d2a386cc5e95a78e3ab30c2ed9df712c356"} Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.243336 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zhgh4" 
event={"ID":"403ca5f6-bd52-40de-88d6-5151b3202c76","Type":"ContainerStarted","Data":"2383508dcb35c927549b146a858bc10ffc0e92b071146282263459310c1e4a93"} Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.259510 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566738-5cj22" event={"ID":"ba2ab33e-6ecc-4eac-9aaa-256e6ff68236","Type":"ContainerStarted","Data":"ea1a7118d7d4729065b9248b97584b35507102283df7254e36c0c2abc1c111d1"} Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.263070 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rtgqn" event={"ID":"d09bb09c-7ad0-4971-b6a2-1b37bff617b5","Type":"ContainerStarted","Data":"e80b807de4a3049e5057d7d45493e0587164dea0c8c1edd9689b188800436526"} Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.263128 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rtgqn" event={"ID":"d09bb09c-7ad0-4971-b6a2-1b37bff617b5","Type":"ContainerStarted","Data":"1be8f50e5be893519c62ef1a0fe4717331250880581423c364caac3b6d6e6db5"} Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.273939 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gf5nr" event={"ID":"8fe93f79-239c-4b6a-bd22-bbdf55aff0af","Type":"ContainerStarted","Data":"10182e252a57c4dee35dd79f35e563d449506e424f564e1d6f77f6f936a8157f"} Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.281980 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gf5nr" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.286092 4860 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-gf5nr container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get 
\"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" start-of-body= Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.286947 4860 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gf5nr" podUID="8fe93f79-239c-4b6a-bd22-bbdf55aff0af" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.293564 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:26 crc kubenswrapper[4860]: E0320 10:58:26.296657 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:26.796617849 +0000 UTC m=+231.017978747 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.317036 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qgkd4" event={"ID":"1414be44-7a88-4f16-9653-51a5793bd729","Type":"ContainerStarted","Data":"53052c3063c156aedf31adf1af52de60aeafbc6346da82a71c98e0b304e2fb37"} Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.322379 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-sqrz5" podStartSLOduration=158.322354475 podStartE2EDuration="2m38.322354475s" podCreationTimestamp="2026-03-20 10:55:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:26.219079253 +0000 UTC m=+230.440440171" watchObservedRunningTime="2026-03-20 10:58:26.322354475 +0000 UTC m=+230.543715373" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.323328 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gf5nr" podStartSLOduration=157.323320442 podStartE2EDuration="2m37.323320442s" podCreationTimestamp="2026-03-20 10:55:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:26.312534162 +0000 UTC m=+230.533895090" watchObservedRunningTime="2026-03-20 10:58:26.323320442 +0000 UTC m=+230.544681340" Mar 20 10:58:26 
crc kubenswrapper[4860]: I0320 10:58:26.334815 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9ffz6" event={"ID":"9d98ac55-cf65-4f72-805b-dd3da2742004","Type":"ContainerStarted","Data":"311fba8b08b109183afdc96b1c3152adff88a2bff256e16014ab0e9b6c0499f4"} Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.385566 4860 generic.go:334] "Generic (PLEG): container finished" podID="a8f2eaf6-3749-4695-8df1-5972598c8ac6" containerID="abb213a6d9940d2db7762d80a9868c4056074390a627a64a947a21d157bba659" exitCode=0 Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.385819 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-vpp2k" event={"ID":"a8f2eaf6-3749-4695-8df1-5972598c8ac6","Type":"ContainerDied","Data":"abb213a6d9940d2db7762d80a9868c4056074390a627a64a947a21d157bba659"} Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.385905 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-vpp2k" event={"ID":"a8f2eaf6-3749-4695-8df1-5972598c8ac6","Type":"ContainerStarted","Data":"50bed181877a9344dfc66eb7fa4304ed83502e60d2e2274f0ed4146f68a4d2cf"} Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.402477 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:26 crc kubenswrapper[4860]: E0320 10:58:26.403462 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-20 10:58:26.903213974 +0000 UTC m=+231.124574872 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.447387 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wv5fd" event={"ID":"f1a219c6-74dc-4511-867e-cf2fce301cad","Type":"ContainerStarted","Data":"6cb01048ec791d0ae2ded19a0df5345b6d32b405d6fac7dec1f065ca6e38ec54"} Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.447775 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wv5fd" event={"ID":"f1a219c6-74dc-4511-867e-cf2fce301cad","Type":"ContainerStarted","Data":"5d6c4e62cf466546c0a82273a8982d2bba36570338fc813f660a5b88b97fa102"} Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.474198 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mcqdw" event={"ID":"628f2025-d050-42a9-bf56-9daa0e5c001b","Type":"ContainerStarted","Data":"b0677e3f5760dedf3c22df48e7121a44fa069a668de16580ffd7f952129c4950"} Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.483976 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wv5fd" podStartSLOduration=157.48394643 podStartE2EDuration="2m37.48394643s" podCreationTimestamp="2026-03-20 10:55:49 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:26.473728446 +0000 UTC m=+230.695089364" watchObservedRunningTime="2026-03-20 10:58:26.48394643 +0000 UTC m=+230.705307328" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.492296 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-7xnrh" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.493800 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-6xl8q" event={"ID":"c9dab77c-3c60-4c91-8c0a-31791124462d","Type":"ContainerStarted","Data":"17a7c8b3edce8ec8ad08fac4b00ce0baebdf9ba0dbcbb8fc7ee67ec4d729c98c"} Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.496382 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-l2xf5" event={"ID":"10eeabbc-0cee-4be7-a0aa-c0fe0c570e0e","Type":"ContainerStarted","Data":"df633c8b9afba10b54e1f937581b5c2eff7f9fa11d295ff0623574ab82c94099"} Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.501414 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2glmz" event={"ID":"3c8d3f73-b9b3-48cc-b6a0-f882d45bc9d3","Type":"ContainerStarted","Data":"226188a14343e1cc22dbbf45965258ca3dd238d77f3f2111ca764af7dbf3d8d7"} Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.504113 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:26 crc kubenswrapper[4860]: E0320 10:58:26.505209 4860 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:27.005189111 +0000 UTC m=+231.226549999 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.548658 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-twkfs" event={"ID":"0eff7ea5-251b-44de-b129-c604349d6e6c","Type":"ContainerStarted","Data":"e28d29a9ed40d0a5cdf5cae2c59f8688f31f4b2d19141e3c1080df3b1880da1f"} Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.552552 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-9ffd4b47b-9qh65"] Mar 20 10:58:26 crc kubenswrapper[4860]: E0320 10:58:26.552893 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ef8eec8-b86d-4f5a-931e-c76e11c07f94" containerName="controller-manager" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.552924 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ef8eec8-b86d-4f5a-931e-c76e11c07f94" containerName="controller-manager" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.553055 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ef8eec8-b86d-4f5a-931e-c76e11c07f94" containerName="controller-manager" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.553659 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-9ffd4b47b-9qh65" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.589141 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-6xl8q" podStartSLOduration=158.589115856 podStartE2EDuration="2m38.589115856s" podCreationTimestamp="2026-03-20 10:55:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:26.589078065 +0000 UTC m=+230.810438963" watchObservedRunningTime="2026-03-20 10:58:26.589115856 +0000 UTC m=+230.810476754" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.607257 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6ef8eec8-b86d-4f5a-931e-c76e11c07f94-proxy-ca-bundles\") pod \"6ef8eec8-b86d-4f5a-931e-c76e11c07f94\" (UID: \"6ef8eec8-b86d-4f5a-931e-c76e11c07f94\") " Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.607346 4860 generic.go:334] "Generic (PLEG): container finished" podID="6ef8eec8-b86d-4f5a-931e-c76e11c07f94" containerID="80c24fbcd6aa7799e3ff2f38bb7fe7014267c8d3945752963f8b906a277e84a6" exitCode=0 Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.607536 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-7xnrh" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.607531 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-7xnrh" event={"ID":"6ef8eec8-b86d-4f5a-931e-c76e11c07f94","Type":"ContainerDied","Data":"80c24fbcd6aa7799e3ff2f38bb7fe7014267c8d3945752963f8b906a277e84a6"} Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.607599 4860 scope.go:117] "RemoveContainer" containerID="80c24fbcd6aa7799e3ff2f38bb7fe7014267c8d3945752963f8b906a277e84a6" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.608423 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ef8eec8-b86d-4f5a-931e-c76e11c07f94-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "6ef8eec8-b86d-4f5a-931e-c76e11c07f94" (UID: "6ef8eec8-b86d-4f5a-931e-c76e11c07f94"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.610551 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7g2j\" (UniqueName: \"kubernetes.io/projected/6ef8eec8-b86d-4f5a-931e-c76e11c07f94-kube-api-access-x7g2j\") pod \"6ef8eec8-b86d-4f5a-931e-c76e11c07f94\" (UID: \"6ef8eec8-b86d-4f5a-931e-c76e11c07f94\") " Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.610740 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ef8eec8-b86d-4f5a-931e-c76e11c07f94-config\") pod \"6ef8eec8-b86d-4f5a-931e-c76e11c07f94\" (UID: \"6ef8eec8-b86d-4f5a-931e-c76e11c07f94\") " Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.610777 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6ef8eec8-b86d-4f5a-931e-c76e11c07f94-client-ca\") pod \"6ef8eec8-b86d-4f5a-931e-c76e11c07f94\" (UID: \"6ef8eec8-b86d-4f5a-931e-c76e11c07f94\") " Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.610820 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ef8eec8-b86d-4f5a-931e-c76e11c07f94-serving-cert\") pod \"6ef8eec8-b86d-4f5a-931e-c76e11c07f94\" (UID: \"6ef8eec8-b86d-4f5a-931e-c76e11c07f94\") " Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.610948 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-9ffd4b47b-9qh65"] Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.611114 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2af6ac3e-b4b2-400b-bfd2-0142f07dabd0-client-ca\") pod \"controller-manager-9ffd4b47b-9qh65\" (UID: \"2af6ac3e-b4b2-400b-bfd2-0142f07dabd0\") 
" pod="openshift-controller-manager/controller-manager-9ffd4b47b-9qh65" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.611244 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdqss\" (UniqueName: \"kubernetes.io/projected/2af6ac3e-b4b2-400b-bfd2-0142f07dabd0-kube-api-access-bdqss\") pod \"controller-manager-9ffd4b47b-9qh65\" (UID: \"2af6ac3e-b4b2-400b-bfd2-0142f07dabd0\") " pod="openshift-controller-manager/controller-manager-9ffd4b47b-9qh65" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.611290 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.611326 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2af6ac3e-b4b2-400b-bfd2-0142f07dabd0-proxy-ca-bundles\") pod \"controller-manager-9ffd4b47b-9qh65\" (UID: \"2af6ac3e-b4b2-400b-bfd2-0142f07dabd0\") " pod="openshift-controller-manager/controller-manager-9ffd4b47b-9qh65" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.611391 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2af6ac3e-b4b2-400b-bfd2-0142f07dabd0-serving-cert\") pod \"controller-manager-9ffd4b47b-9qh65\" (UID: \"2af6ac3e-b4b2-400b-bfd2-0142f07dabd0\") " pod="openshift-controller-manager/controller-manager-9ffd4b47b-9qh65" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.611515 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2af6ac3e-b4b2-400b-bfd2-0142f07dabd0-config\") pod \"controller-manager-9ffd4b47b-9qh65\" (UID: \"2af6ac3e-b4b2-400b-bfd2-0142f07dabd0\") " pod="openshift-controller-manager/controller-manager-9ffd4b47b-9qh65" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.611509 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ef8eec8-b86d-4f5a-931e-c76e11c07f94-config" (OuterVolumeSpecName: "config") pod "6ef8eec8-b86d-4f5a-931e-c76e11c07f94" (UID: "6ef8eec8-b86d-4f5a-931e-c76e11c07f94"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.611999 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ef8eec8-b86d-4f5a-931e-c76e11c07f94-client-ca" (OuterVolumeSpecName: "client-ca") pod "6ef8eec8-b86d-4f5a-931e-c76e11c07f94" (UID: "6ef8eec8-b86d-4f5a-931e-c76e11c07f94"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.615557 4860 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6ef8eec8-b86d-4f5a-931e-c76e11c07f94-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:26 crc kubenswrapper[4860]: E0320 10:58:26.616371 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:27.116347583 +0000 UTC m=+231.337708481 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.624513 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ef8eec8-b86d-4f5a-931e-c76e11c07f94-kube-api-access-x7g2j" (OuterVolumeSpecName: "kube-api-access-x7g2j") pod "6ef8eec8-b86d-4f5a-931e-c76e11c07f94" (UID: "6ef8eec8-b86d-4f5a-931e-c76e11c07f94"). InnerVolumeSpecName "kube-api-access-x7g2j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.636935 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ef8eec8-b86d-4f5a-931e-c76e11c07f94-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6ef8eec8-b86d-4f5a-931e-c76e11c07f94" (UID: "6ef8eec8-b86d-4f5a-931e-c76e11c07f94"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.694376 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-2k58g" event={"ID":"dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383","Type":"ContainerStarted","Data":"fef511d5e2c4298672a21a0b504a82b2f3a7318dc0d3e67b676816aca13424f3"} Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.706052 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-twkfs" podStartSLOduration=157.705999507 podStartE2EDuration="2m37.705999507s" podCreationTimestamp="2026-03-20 10:55:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:26.697943353 +0000 UTC m=+230.919304271" watchObservedRunningTime="2026-03-20 10:58:26.705999507 +0000 UTC m=+230.927360425" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.718770 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:26 crc kubenswrapper[4860]: E0320 10:58:26.718949 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:27.218918986 +0000 UTC m=+231.440279884 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.719036 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2af6ac3e-b4b2-400b-bfd2-0142f07dabd0-client-ca\") pod \"controller-manager-9ffd4b47b-9qh65\" (UID: \"2af6ac3e-b4b2-400b-bfd2-0142f07dabd0\") " pod="openshift-controller-manager/controller-manager-9ffd4b47b-9qh65" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.719094 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdqss\" (UniqueName: \"kubernetes.io/projected/2af6ac3e-b4b2-400b-bfd2-0142f07dabd0-kube-api-access-bdqss\") pod \"controller-manager-9ffd4b47b-9qh65\" (UID: \"2af6ac3e-b4b2-400b-bfd2-0142f07dabd0\") " pod="openshift-controller-manager/controller-manager-9ffd4b47b-9qh65" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.719128 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.719154 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/2af6ac3e-b4b2-400b-bfd2-0142f07dabd0-proxy-ca-bundles\") pod \"controller-manager-9ffd4b47b-9qh65\" (UID: \"2af6ac3e-b4b2-400b-bfd2-0142f07dabd0\") " pod="openshift-controller-manager/controller-manager-9ffd4b47b-9qh65" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.719186 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2af6ac3e-b4b2-400b-bfd2-0142f07dabd0-serving-cert\") pod \"controller-manager-9ffd4b47b-9qh65\" (UID: \"2af6ac3e-b4b2-400b-bfd2-0142f07dabd0\") " pod="openshift-controller-manager/controller-manager-9ffd4b47b-9qh65" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.719312 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2af6ac3e-b4b2-400b-bfd2-0142f07dabd0-config\") pod \"controller-manager-9ffd4b47b-9qh65\" (UID: \"2af6ac3e-b4b2-400b-bfd2-0142f07dabd0\") " pod="openshift-controller-manager/controller-manager-9ffd4b47b-9qh65" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.719379 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7g2j\" (UniqueName: \"kubernetes.io/projected/6ef8eec8-b86d-4f5a-931e-c76e11c07f94-kube-api-access-x7g2j\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.719399 4860 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ef8eec8-b86d-4f5a-931e-c76e11c07f94-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.719410 4860 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6ef8eec8-b86d-4f5a-931e-c76e11c07f94-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.719420 4860 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/6ef8eec8-b86d-4f5a-931e-c76e11c07f94-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:26 crc kubenswrapper[4860]: E0320 10:58:26.720124 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:27.220102059 +0000 UTC m=+231.441462957 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.720925 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2af6ac3e-b4b2-400b-bfd2-0142f07dabd0-proxy-ca-bundles\") pod \"controller-manager-9ffd4b47b-9qh65\" (UID: \"2af6ac3e-b4b2-400b-bfd2-0142f07dabd0\") " pod="openshift-controller-manager/controller-manager-9ffd4b47b-9qh65" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.721018 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2af6ac3e-b4b2-400b-bfd2-0142f07dabd0-config\") pod \"controller-manager-9ffd4b47b-9qh65\" (UID: \"2af6ac3e-b4b2-400b-bfd2-0142f07dabd0\") " pod="openshift-controller-manager/controller-manager-9ffd4b47b-9qh65" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.721573 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2af6ac3e-b4b2-400b-bfd2-0142f07dabd0-client-ca\") pod 
\"controller-manager-9ffd4b47b-9qh65\" (UID: \"2af6ac3e-b4b2-400b-bfd2-0142f07dabd0\") " pod="openshift-controller-manager/controller-manager-9ffd4b47b-9qh65" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.735576 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2af6ac3e-b4b2-400b-bfd2-0142f07dabd0-serving-cert\") pod \"controller-manager-9ffd4b47b-9qh65\" (UID: \"2af6ac3e-b4b2-400b-bfd2-0142f07dabd0\") " pod="openshift-controller-manager/controller-manager-9ffd4b47b-9qh65" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.752802 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-45vfv" event={"ID":"825c6b77-c03a-463c-b9a4-d26a1ac398f0","Type":"ContainerStarted","Data":"c602627305bcc67508e306d123ef0f2def758ac6be365554271d4574f4292d93"} Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.753654 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-45vfv" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.771610 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdqss\" (UniqueName: \"kubernetes.io/projected/2af6ac3e-b4b2-400b-bfd2-0142f07dabd0-kube-api-access-bdqss\") pod \"controller-manager-9ffd4b47b-9qh65\" (UID: \"2af6ac3e-b4b2-400b-bfd2-0142f07dabd0\") " pod="openshift-controller-manager/controller-manager-9ffd4b47b-9qh65" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.772066 4860 patch_prober.go:28] interesting pod/downloads-7954f5f757-45vfv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.772118 4860 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-45vfv" 
podUID="825c6b77-c03a-463c-b9a4-d26a1ac398f0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.773725 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6stjq" event={"ID":"efda2c60-f018-417a-a73d-2727be57b558","Type":"ContainerStarted","Data":"6776c7db4ac76edf11af1347a2ba1d42e85d0db1f3adcb8253de7bf78e663beb"} Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.773768 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6stjq" event={"ID":"efda2c60-f018-417a-a73d-2727be57b558","Type":"ContainerStarted","Data":"b30c8b5006161bc9927b1750dd350c7c3092a5250d3aab3e6fff675f0246db09"} Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.778651 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-x4x44" event={"ID":"f5043c51-bd3f-461f-b011-a42ad38ed7d4","Type":"ContainerStarted","Data":"4a60f0b7f3637689f96bd20d58159e91d16d4c90136b6d4143847488cfc66c75"} Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.779874 4860 patch_prober.go:28] interesting pod/console-operator-58897d9998-pc5tf container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.19:8443/readyz\": dial tcp 10.217.0.19:8443: connect: connection refused" start-of-body= Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.779915 4860 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-pc5tf" podUID="7ee7ff56-a3fb-465a-9ab5-8cba6bbfcd0e" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.19:8443/readyz\": dial 
tcp 10.217.0.19:8443: connect: connection refused" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.784796 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nxq82" podUID="f15060fa-5a28-4a12-be7b-2823e921eb90" containerName="route-controller-manager" containerID="cri-o://772e9b2c3443f142ebe0053c3effb51efaae04c134aa23c92ff36573b43b35df" gracePeriod=30 Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.813383 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-45vfv" podStartSLOduration=158.813351223 podStartE2EDuration="2m38.813351223s" podCreationTimestamp="2026-03-20 10:55:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:26.796728841 +0000 UTC m=+231.018089739" watchObservedRunningTime="2026-03-20 10:58:26.813351223 +0000 UTC m=+231.034712121" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.821203 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:26 crc kubenswrapper[4860]: E0320 10:58:26.821736 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:27.321707136 +0000 UTC m=+231.543068034 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.869151 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-x4x44" podStartSLOduration=157.869084224 podStartE2EDuration="2m37.869084224s" podCreationTimestamp="2026-03-20 10:55:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:26.841336102 +0000 UTC m=+231.062697020" watchObservedRunningTime="2026-03-20 10:58:26.869084224 +0000 UTC m=+231.090445122" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.877204 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6stjq" podStartSLOduration=157.877173018 podStartE2EDuration="2m37.877173018s" podCreationTimestamp="2026-03-20 10:55:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:26.873057414 +0000 UTC m=+231.094418312" watchObservedRunningTime="2026-03-20 10:58:26.877173018 +0000 UTC m=+231.098533916" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.924182 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.927625 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-twkfs" Mar 20 10:58:26 crc kubenswrapper[4860]: E0320 10:58:26.927978 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:27.427961901 +0000 UTC m=+231.649322799 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.941136 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-twkfs" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.946723 4860 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-twkfs container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.11:8443/livez\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.946802 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-twkfs" podUID="0eff7ea5-251b-44de-b129-c604349d6e6c" containerName="oauth-apiserver" probeResult="failure" output="Get 
\"https://10.217.0.11:8443/livez\": dial tcp 10.217.0.11:8443: connect: connection refused" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.962708 4860 ???:1] "http: TLS handshake error from 192.168.126.11:54584: no serving certificate available for the kubelet" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.975456 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-9ffd4b47b-9qh65" Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.009863 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7xnrh"] Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.013216 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7xnrh"] Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.026368 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:27 crc kubenswrapper[4860]: E0320 10:58:27.026843 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:27.526807671 +0000 UTC m=+231.748168569 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.027051 4860 patch_prober.go:28] interesting pod/router-default-5444994796-5gdgj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 10:58:27 crc kubenswrapper[4860]: [-]has-synced failed: reason withheld Mar 20 10:58:27 crc kubenswrapper[4860]: [+]process-running ok Mar 20 10:58:27 crc kubenswrapper[4860]: healthz check failed Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.027099 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5gdgj" podUID="d582dc3e-7510-42be-aa3a-1d15b35c327c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.079286 4860 ???:1] "http: TLS handshake error from 192.168.126.11:54590: no serving certificate available for the kubelet" Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.100941 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.131657 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" 
(UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:27 crc kubenswrapper[4860]: E0320 10:58:27.132032 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:27.632016947 +0000 UTC m=+231.853377845 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.192195 4860 ???:1] "http: TLS handshake error from 192.168.126.11:54600: no serving certificate available for the kubelet" Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.233606 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:27 crc kubenswrapper[4860]: E0320 10:58:27.233989 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:27.733968593 +0000 UTC m=+231.955329491 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.307867 4860 ???:1] "http: TLS handshake error from 192.168.126.11:54612: no serving certificate available for the kubelet" Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.336315 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:27 crc kubenswrapper[4860]: E0320 10:58:27.336663 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:27.836649039 +0000 UTC m=+232.058009937 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.429512 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ef8eec8-b86d-4f5a-931e-c76e11c07f94" path="/var/lib/kubelet/pods/6ef8eec8-b86d-4f5a-931e-c76e11c07f94/volumes" Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.444696 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:27 crc kubenswrapper[4860]: E0320 10:58:27.445080 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:27.945056975 +0000 UTC m=+232.166417873 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.446116 4860 ???:1] "http: TLS handshake error from 192.168.126.11:54626: no serving certificate available for the kubelet" Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.548562 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:27 crc kubenswrapper[4860]: E0320 10:58:27.549141 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:28.049097999 +0000 UTC m=+232.270458897 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.585829 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nxq82" Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.602140 4860 ???:1] "http: TLS handshake error from 192.168.126.11:54632: no serving certificate available for the kubelet" Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.651210 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:27 crc kubenswrapper[4860]: E0320 10:58:27.651643 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:28.151622511 +0000 UTC m=+232.372983409 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.701357 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-9ffd4b47b-9qh65"]
Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.754792 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f15060fa-5a28-4a12-be7b-2823e921eb90-config\") pod \"f15060fa-5a28-4a12-be7b-2823e921eb90\" (UID: \"f15060fa-5a28-4a12-be7b-2823e921eb90\") "
Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.754880 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f15060fa-5a28-4a12-be7b-2823e921eb90-client-ca\") pod \"f15060fa-5a28-4a12-be7b-2823e921eb90\" (UID: \"f15060fa-5a28-4a12-be7b-2823e921eb90\") "
Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.754959 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d88n2\" (UniqueName: \"kubernetes.io/projected/f15060fa-5a28-4a12-be7b-2823e921eb90-kube-api-access-d88n2\") pod \"f15060fa-5a28-4a12-be7b-2823e921eb90\" (UID: \"f15060fa-5a28-4a12-be7b-2823e921eb90\") "
Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.755076 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f15060fa-5a28-4a12-be7b-2823e921eb90-serving-cert\") pod \"f15060fa-5a28-4a12-be7b-2823e921eb90\" (UID: \"f15060fa-5a28-4a12-be7b-2823e921eb90\") "
Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.755484 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm"
Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.756116 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f15060fa-5a28-4a12-be7b-2823e921eb90-config" (OuterVolumeSpecName: "config") pod "f15060fa-5a28-4a12-be7b-2823e921eb90" (UID: "f15060fa-5a28-4a12-be7b-2823e921eb90"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.756148 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f15060fa-5a28-4a12-be7b-2823e921eb90-client-ca" (OuterVolumeSpecName: "client-ca") pod "f15060fa-5a28-4a12-be7b-2823e921eb90" (UID: "f15060fa-5a28-4a12-be7b-2823e921eb90"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 10:58:27 crc kubenswrapper[4860]: E0320 10:58:27.757417 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:28.257394823 +0000 UTC m=+232.478755721 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.772744 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f15060fa-5a28-4a12-be7b-2823e921eb90-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f15060fa-5a28-4a12-be7b-2823e921eb90" (UID: "f15060fa-5a28-4a12-be7b-2823e921eb90"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.778742 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f15060fa-5a28-4a12-be7b-2823e921eb90-kube-api-access-d88n2" (OuterVolumeSpecName: "kube-api-access-d88n2") pod "f15060fa-5a28-4a12-be7b-2823e921eb90" (UID: "f15060fa-5a28-4a12-be7b-2823e921eb90"). InnerVolumeSpecName "kube-api-access-d88n2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.815427 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-9ktqw" event={"ID":"b2a9fedf-d226-4388-8432-b22efd3b74bb","Type":"ContainerStarted","Data":"29fa7dcbcc70b6de21bd09a99263130c3ad92f555e2f4e915a5fb6190ecc268c"}
Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.858422 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-45vfv" event={"ID":"825c6b77-c03a-463c-b9a4-d26a1ac398f0","Type":"ContainerStarted","Data":"881f32cfad3d1ae79569c4b2cd108eca09988b3ebd33507a5a8247454327f28d"}
Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.860311 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.860729 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d88n2\" (UniqueName: \"kubernetes.io/projected/f15060fa-5a28-4a12-be7b-2823e921eb90-kube-api-access-d88n2\") on node \"crc\" DevicePath \"\""
Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.860749 4860 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f15060fa-5a28-4a12-be7b-2823e921eb90-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.860765 4860 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f15060fa-5a28-4a12-be7b-2823e921eb90-config\") on node \"crc\" DevicePath \"\""
Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.860779 4860 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f15060fa-5a28-4a12-be7b-2823e921eb90-client-ca\") on node \"crc\" DevicePath \"\""
Mar 20 10:58:27 crc kubenswrapper[4860]: E0320 10:58:27.860858 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:28.36083513 +0000 UTC m=+232.582196028 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.862618 4860 patch_prober.go:28] interesting pod/downloads-7954f5f757-45vfv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body=
Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.862771 4860 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-45vfv" podUID="825c6b77-c03a-463c-b9a4-d26a1ac398f0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused"
Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.866973 4860 ???:1] "http: TLS handshake error from 192.168.126.11:54642: no serving certificate available for the kubelet"
Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.881183 4860 generic.go:334] "Generic (PLEG): container finished" podID="f15060fa-5a28-4a12-be7b-2823e921eb90" containerID="772e9b2c3443f142ebe0053c3effb51efaae04c134aa23c92ff36573b43b35df" exitCode=0
Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.881321 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nxq82" event={"ID":"f15060fa-5a28-4a12-be7b-2823e921eb90","Type":"ContainerDied","Data":"772e9b2c3443f142ebe0053c3effb51efaae04c134aa23c92ff36573b43b35df"}
Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.881362 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nxq82" event={"ID":"f15060fa-5a28-4a12-be7b-2823e921eb90","Type":"ContainerDied","Data":"a1580a457002eb0c992197304d7aa1c99c6001d60b87a1e21dc5b0c8a7c76848"}
Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.881384 4860 scope.go:117] "RemoveContainer" containerID="772e9b2c3443f142ebe0053c3effb51efaae04c134aa23c92ff36573b43b35df"
Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.881440 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nxq82"
Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.897857 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-wt65f" event={"ID":"1241cd05-23d3-4e5a-9130-29e7638003a9","Type":"ContainerStarted","Data":"a031a1412ae277d24d326cdbaaf60a10073ce180edac74049c4bcbc1b4ab7884"}
Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.913410 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-vpp2k" event={"ID":"a8f2eaf6-3749-4695-8df1-5972598c8ac6","Type":"ContainerStarted","Data":"9452fecfc4671cbfdf74ab92361212b9bb37ce495999a676abc9da575051b245"}
Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.915989 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-vk2rn" event={"ID":"b7c6fefc-e60e-423d-ad15-2e16173ae01b","Type":"ContainerStarted","Data":"9291001c0f705a7789261ff5f2e547e4ab384e5c4dae053fe45cb20f4ceb5da4"}
Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.933735 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-l2xf5" event={"ID":"10eeabbc-0cee-4be7-a0aa-c0fe0c570e0e","Type":"ContainerStarted","Data":"2dd52f924be48f54af4981b865933f6717b8def1257274a13547ce3d43e0cb71"}
Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.933784 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-l2xf5" event={"ID":"10eeabbc-0cee-4be7-a0aa-c0fe0c570e0e","Type":"ContainerStarted","Data":"2f319b8086262e609f38a6e6210967d809d5447583848911397473b5f25f8b01"}
Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.934323 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-l2xf5"
Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.953868 4860 scope.go:117] "RemoveContainer" containerID="772e9b2c3443f142ebe0053c3effb51efaae04c134aa23c92ff36573b43b35df"
Mar 20 10:58:27 crc kubenswrapper[4860]: E0320 10:58:27.959949 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"772e9b2c3443f142ebe0053c3effb51efaae04c134aa23c92ff36573b43b35df\": container with ID starting with 772e9b2c3443f142ebe0053c3effb51efaae04c134aa23c92ff36573b43b35df not found: ID does not exist" containerID="772e9b2c3443f142ebe0053c3effb51efaae04c134aa23c92ff36573b43b35df"
Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.960012 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"772e9b2c3443f142ebe0053c3effb51efaae04c134aa23c92ff36573b43b35df"} err="failed to get container status \"772e9b2c3443f142ebe0053c3effb51efaae04c134aa23c92ff36573b43b35df\": rpc error: code = NotFound desc = could not find container \"772e9b2c3443f142ebe0053c3effb51efaae04c134aa23c92ff36573b43b35df\": container with ID starting with 772e9b2c3443f142ebe0053c3effb51efaae04c134aa23c92ff36573b43b35df not found: ID does not exist"
Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.963201 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm"
Mar 20 10:58:27 crc kubenswrapper[4860]: E0320 10:58:27.963844 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:28.463826755 +0000 UTC m=+232.685187653 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.985077 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mcqdw" event={"ID":"628f2025-d050-42a9-bf56-9daa0e5c001b","Type":"ContainerStarted","Data":"a81aa45637bb30ca12bbd6e9b9ccc6139664b58c32f277fca2d07ecbad1b1373"}
Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.985553 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mcqdw" event={"ID":"628f2025-d050-42a9-bf56-9daa0e5c001b","Type":"ContainerStarted","Data":"5ad20e85aea3df60e61b092ce121c20cbeb6d50d9aa3330cbe520198797d07b3"}
Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.986515 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mcqdw"
Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.043798 4860 patch_prober.go:28] interesting pod/router-default-5444994796-5gdgj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 10:58:28 crc kubenswrapper[4860]: [-]has-synced failed: reason withheld
Mar 20 10:58:28 crc kubenswrapper[4860]: [+]process-running ok
Mar 20 10:58:28 crc kubenswrapper[4860]: healthz check failed
Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.044986 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5gdgj" podUID="d582dc3e-7510-42be-aa3a-1d15b35c327c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.051540 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rtgqn" event={"ID":"d09bb09c-7ad0-4971-b6a2-1b37bff617b5","Type":"ContainerStarted","Data":"53ddd82409bd7ee46782b6056a3d3362a98c84210ccf946b74b468ffd1e7f06d"}
Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.064361 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 10:58:28 crc kubenswrapper[4860]: E0320 10:58:28.065612 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:28.565589466 +0000 UTC m=+232.786950364 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.087031 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2glmz" event={"ID":"3c8d3f73-b9b3-48cc-b6a0-f882d45bc9d3","Type":"ContainerStarted","Data":"7445bc2e888e5beade54243d7a640ad66ebdd01e0dcbdbef3a62bad3d7f216f8"}
Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.087082 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2glmz" event={"ID":"3c8d3f73-b9b3-48cc-b6a0-f882d45bc9d3","Type":"ContainerStarted","Data":"097a19b0977f7b39523604e06b5911e68d3f8c53d1497fc3a9c3b86af15dc0ed"}
Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.115112 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-nxq82"]
Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.129774 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-nxq82"]
Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.133271 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jnqsw" event={"ID":"8bc351c5-b724-443e-a7e2-f4abba352cef","Type":"ContainerStarted","Data":"3430764001e19d4b3e2e51c6595c5ad940f0983e589f3610bbbdfe28e9d91d1d"}
Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.166347 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm"
Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.166757 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qgkd4" event={"ID":"1414be44-7a88-4f16-9653-51a5793bd729","Type":"ContainerStarted","Data":"841517bb1d97b11c96a8438b06509cff1b86584b43f982f2219e934bfe3b0d80"}
Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.166830 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qgkd4" event={"ID":"1414be44-7a88-4f16-9653-51a5793bd729","Type":"ContainerStarted","Data":"33c0b9f7f0bb92315706c73f4bd0e839fc0dee58a8b9016cc9e3acd93e4a9d65"}
Mar 20 10:58:28 crc kubenswrapper[4860]: E0320 10:58:28.168743 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:28.668690314 +0000 UTC m=+232.890051212 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.197962 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vfkd4" event={"ID":"34043403-110c-4547-81a4-7af1429878cd","Type":"ContainerStarted","Data":"40266b1b2c1f3a20cf5bc717e097f2ddb0b6b2bc6f5f0f395b6eea3955ef2e54"}
Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.199406 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vfkd4"
Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.213063 4860 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-vfkd4 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.21:5443/healthz\": dial tcp 10.217.0.21:5443: connect: connection refused" start-of-body=
Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.213133 4860 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vfkd4" podUID="34043403-110c-4547-81a4-7af1429878cd" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.21:5443/healthz\": dial tcp 10.217.0.21:5443: connect: connection refused"
Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.214038 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xmkqm"
Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.225238 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zhgh4" event={"ID":"403ca5f6-bd52-40de-88d6-5151b3202c76","Type":"ContainerStarted","Data":"857625adfd41b4c07f7a8c47d596e9428a9381932a0175f6ddfcb8f7de0229a9"}
Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.226498 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-zhgh4"
Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.253541 4860 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-zhgh4 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body=
Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.253618 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-d6wf9" event={"ID":"437c32d4-4b5f-4657-86d6-5214e3bfc01f","Type":"ContainerStarted","Data":"509a0ab6073b8f241ed054d972f10c10904777731b271c4522d9caaf55b66c8c"}
Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.253625 4860 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-zhgh4" podUID="403ca5f6-bd52-40de-88d6-5151b3202c76" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused"
Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.255823 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-vk2rn" podStartSLOduration=159.255793147 podStartE2EDuration="2m39.255793147s" podCreationTimestamp="2026-03-20 10:55:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:28.239152734 +0000 UTC m=+232.460513642" watchObservedRunningTime="2026-03-20 10:58:28.255793147 +0000 UTC m=+232.477154045"
Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.274947 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 10:58:28 crc kubenswrapper[4860]: E0320 10:58:28.276389 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:28.776372709 +0000 UTC m=+232.997733607 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.303651 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v6ff7" event={"ID":"f171fccd-40ef-44a6-941f-ef1f2f4d2c2a","Type":"ContainerStarted","Data":"8ee0e229e69a6f243d3f1e6dd07c122f01aaf2d19a61572557392196999c3a13"}
Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.330013 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mcqdw" podStartSLOduration=159.32999076 podStartE2EDuration="2m39.32999076s" podCreationTimestamp="2026-03-20 10:55:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:28.327729278 +0000 UTC m=+232.549090176" watchObservedRunningTime="2026-03-20 10:58:28.32999076 +0000 UTC m=+232.551351658"
Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.346679 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gf5nr" event={"ID":"8fe93f79-239c-4b6a-bd22-bbdf55aff0af","Type":"ContainerStarted","Data":"61502ebc31445385563013c72335cb75130e8d70987eb86e2c3acd4a5cf1b222"}
Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.382331 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm"
Mar 20 10:58:28 crc kubenswrapper[4860]: E0320 10:58:28.384321 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:28.884306821 +0000 UTC m=+233.105667719 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.382215 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gf5nr"
Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.406801 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jvcqp" event={"ID":"e8d5eff6-150c-4314-8ebc-38b3660ce01a","Type":"ContainerStarted","Data":"b6f16e4fdc471b1f707e108886719c9840d2255ce081c81e52557545b38ad261"}
Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.431584 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-l2xf5" podStartSLOduration=9.431557996 podStartE2EDuration="9.431557996s" podCreationTimestamp="2026-03-20 10:58:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:28.429740145 +0000 UTC m=+232.651101043" watchObservedRunningTime="2026-03-20 10:58:28.431557996 +0000 UTC m=+232.652918894"
Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.447625 4860 ???:1] "http: TLS handshake error from 192.168.126.11:54646: no serving certificate available for the kubelet"
Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.455828 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-x4x44" event={"ID":"f5043c51-bd3f-461f-b011-a42ad38ed7d4","Type":"ContainerStarted","Data":"eb01d817668ae366fca7df1e5aa6b6cd1b6d88ee7369e6ccd483226e47ecce63"}
Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.484967 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 10:58:28 crc kubenswrapper[4860]: E0320 10:58:28.485877 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:28.985856206 +0000 UTC m=+233.207217104 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.509525 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-9ffd4b47b-9qh65" event={"ID":"2af6ac3e-b4b2-400b-bfd2-0142f07dabd0","Type":"ContainerStarted","Data":"63a0cd6f2cc2d9eeeaa1bb8e6f130de3ebcf3e0a4cb9179f998ccf85f93ed02e"}
Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.509584 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-9ffd4b47b-9qh65"
Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.531449 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-9ffd4b47b-9qh65"
Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.555327 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9ffz6" event={"ID":"9d98ac55-cf65-4f72-805b-dd3da2742004","Type":"ContainerStarted","Data":"1d0d787c7aef31b6076c19907a676d3a10ac78299df210f8e57035654ff980a3"}
Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.594396 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm"
Mar 20 10:58:28 crc kubenswrapper[4860]: E0320 10:58:28.597437 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:29.097416669 +0000 UTC m=+233.318777567 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.630676 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-9ktqw" podStartSLOduration=159.630652974 podStartE2EDuration="2m39.630652974s" podCreationTimestamp="2026-03-20 10:55:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:28.567711843 +0000 UTC m=+232.789072741" watchObservedRunningTime="2026-03-20 10:58:28.630652974 +0000 UTC m=+232.852013872"
Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.632733 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-wt65f" podStartSLOduration=9.632724861 podStartE2EDuration="9.632724861s" podCreationTimestamp="2026-03-20 10:58:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:28.621726655 +0000 UTC m=+232.843087573" watchObservedRunningTime="2026-03-20 10:58:28.632724861 +0000 UTC m=+232.854085759"
Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.702004 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 10:58:28 crc kubenswrapper[4860]: E0320 10:58:28.705188 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:29.205158146 +0000 UTC m=+233.426519044 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.805976 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-pc5tf"
Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.809211 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm"
Mar 20 10:58:28 crc kubenswrapper[4860]: E0320 10:58:28.809867 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:29.309841278 +0000 UTC m=+233.531202176 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.825430 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jnqsw" podStartSLOduration=159.82536884 podStartE2EDuration="2m39.82536884s" podCreationTimestamp="2026-03-20 10:55:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:28.712978734 +0000 UTC m=+232.934339632" watchObservedRunningTime="2026-03-20 10:58:28.82536884 +0000 UTC m=+233.046729738"
Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.852830 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rtgqn" podStartSLOduration=160.852803683 podStartE2EDuration="2m40.852803683s" podCreationTimestamp="2026-03-20 10:55:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:28.806808124 +0000 UTC m=+233.028169022" watchObservedRunningTime="2026-03-20 10:58:28.852803683 +0000 UTC m=+233.074164581"
Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.857916 4860
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qgkd4" podStartSLOduration=160.857897195 podStartE2EDuration="2m40.857897195s" podCreationTimestamp="2026-03-20 10:55:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:28.855353634 +0000 UTC m=+233.076714552" watchObservedRunningTime="2026-03-20 10:58:28.857897195 +0000 UTC m=+233.079258093" Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.910207 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9ffz6" podStartSLOduration=159.910177119 podStartE2EDuration="2m39.910177119s" podCreationTimestamp="2026-03-20 10:55:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:28.90840897 +0000 UTC m=+233.129769878" watchObservedRunningTime="2026-03-20 10:58:28.910177119 +0000 UTC m=+233.131538017" Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.911021 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:28 crc kubenswrapper[4860]: E0320 10:58:28.911545 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:29.411523297 +0000 UTC m=+233.632884195 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.975642 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v6ff7" podStartSLOduration=159.97561765 podStartE2EDuration="2m39.97561765s" podCreationTimestamp="2026-03-20 10:55:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:28.972471892 +0000 UTC m=+233.193832810" watchObservedRunningTime="2026-03-20 10:58:28.97561765 +0000 UTC m=+233.196978538" Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.009155 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jvcqp" podStartSLOduration=160.009131562 podStartE2EDuration="2m40.009131562s" podCreationTimestamp="2026-03-20 10:55:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:29.00690497 +0000 UTC m=+233.228265868" watchObservedRunningTime="2026-03-20 10:58:29.009131562 +0000 UTC m=+233.230492460" Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.015991 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:29 crc kubenswrapper[4860]: E0320 10:58:29.016375 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:29.516358913 +0000 UTC m=+233.737719811 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.026573 4860 patch_prober.go:28] interesting pod/router-default-5444994796-5gdgj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 10:58:29 crc kubenswrapper[4860]: [-]has-synced failed: reason withheld Mar 20 10:58:29 crc kubenswrapper[4860]: [+]process-running ok Mar 20 10:58:29 crc kubenswrapper[4860]: healthz check failed Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.026661 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5gdgj" podUID="d582dc3e-7510-42be-aa3a-1d15b35c327c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.030494 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5dc8897f6c-8dhfx"] Mar 
20 10:58:29 crc kubenswrapper[4860]: E0320 10:58:29.030747 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f15060fa-5a28-4a12-be7b-2823e921eb90" containerName="route-controller-manager" Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.030770 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="f15060fa-5a28-4a12-be7b-2823e921eb90" containerName="route-controller-manager" Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.030871 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="f15060fa-5a28-4a12-be7b-2823e921eb90" containerName="route-controller-manager" Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.031401 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5dc8897f6c-8dhfx" Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.054933 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.054986 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.055024 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.055053 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.055159 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.078500 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 
10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.096589 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5dc8897f6c-8dhfx"] Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.120202 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:29 crc kubenswrapper[4860]: E0320 10:58:29.120499 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:29.620470569 +0000 UTC m=+233.841831467 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.120994 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/215a61d8-f0e1-419d-b4cb-8ddc801d5a79-config\") pod \"route-controller-manager-5dc8897f6c-8dhfx\" (UID: \"215a61d8-f0e1-419d-b4cb-8ddc801d5a79\") " pod="openshift-route-controller-manager/route-controller-manager-5dc8897f6c-8dhfx" Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.121259 4860 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.121412 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7npd\" (UniqueName: \"kubernetes.io/projected/215a61d8-f0e1-419d-b4cb-8ddc801d5a79-kube-api-access-n7npd\") pod \"route-controller-manager-5dc8897f6c-8dhfx\" (UID: \"215a61d8-f0e1-419d-b4cb-8ddc801d5a79\") " pod="openshift-route-controller-manager/route-controller-manager-5dc8897f6c-8dhfx" Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.121533 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/215a61d8-f0e1-419d-b4cb-8ddc801d5a79-client-ca\") pod \"route-controller-manager-5dc8897f6c-8dhfx\" (UID: \"215a61d8-f0e1-419d-b4cb-8ddc801d5a79\") " pod="openshift-route-controller-manager/route-controller-manager-5dc8897f6c-8dhfx" Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.121681 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/215a61d8-f0e1-419d-b4cb-8ddc801d5a79-serving-cert\") pod \"route-controller-manager-5dc8897f6c-8dhfx\" (UID: \"215a61d8-f0e1-419d-b4cb-8ddc801d5a79\") " pod="openshift-route-controller-manager/route-controller-manager-5dc8897f6c-8dhfx" Mar 20 10:58:29 crc kubenswrapper[4860]: E0320 10:58:29.122190 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-20 10:58:29.622173666 +0000 UTC m=+233.843534564 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.163723 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-9ffd4b47b-9qh65" podStartSLOduration=4.163700511 podStartE2EDuration="4.163700511s" podCreationTimestamp="2026-03-20 10:58:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:29.159991688 +0000 UTC m=+233.381352586" watchObservedRunningTime="2026-03-20 10:58:29.163700511 +0000 UTC m=+233.385061409" Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.187130 4860 ???:1] "http: TLS handshake error from 192.168.126.11:54654: no serving certificate available for the kubelet" Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.196417 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2glmz" podStartSLOduration=160.196395861 podStartE2EDuration="2m40.196395861s" podCreationTimestamp="2026-03-20 10:55:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:29.196167654 +0000 UTC m=+233.417528562" watchObservedRunningTime="2026-03-20 10:58:29.196395861 +0000 UTC m=+233.417756759" Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 
10:58:29.224486 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.224816 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7npd\" (UniqueName: \"kubernetes.io/projected/215a61d8-f0e1-419d-b4cb-8ddc801d5a79-kube-api-access-n7npd\") pod \"route-controller-manager-5dc8897f6c-8dhfx\" (UID: \"215a61d8-f0e1-419d-b4cb-8ddc801d5a79\") " pod="openshift-route-controller-manager/route-controller-manager-5dc8897f6c-8dhfx" Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.224863 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/215a61d8-f0e1-419d-b4cb-8ddc801d5a79-client-ca\") pod \"route-controller-manager-5dc8897f6c-8dhfx\" (UID: \"215a61d8-f0e1-419d-b4cb-8ddc801d5a79\") " pod="openshift-route-controller-manager/route-controller-manager-5dc8897f6c-8dhfx" Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.224919 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/215a61d8-f0e1-419d-b4cb-8ddc801d5a79-serving-cert\") pod \"route-controller-manager-5dc8897f6c-8dhfx\" (UID: \"215a61d8-f0e1-419d-b4cb-8ddc801d5a79\") " pod="openshift-route-controller-manager/route-controller-manager-5dc8897f6c-8dhfx" Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.224945 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/215a61d8-f0e1-419d-b4cb-8ddc801d5a79-config\") pod \"route-controller-manager-5dc8897f6c-8dhfx\" (UID: 
\"215a61d8-f0e1-419d-b4cb-8ddc801d5a79\") " pod="openshift-route-controller-manager/route-controller-manager-5dc8897f6c-8dhfx" Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.226257 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/215a61d8-f0e1-419d-b4cb-8ddc801d5a79-config\") pod \"route-controller-manager-5dc8897f6c-8dhfx\" (UID: \"215a61d8-f0e1-419d-b4cb-8ddc801d5a79\") " pod="openshift-route-controller-manager/route-controller-manager-5dc8897f6c-8dhfx" Mar 20 10:58:29 crc kubenswrapper[4860]: E0320 10:58:29.226784 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:29.726755715 +0000 UTC m=+233.948116773 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.226841 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/215a61d8-f0e1-419d-b4cb-8ddc801d5a79-client-ca\") pod \"route-controller-manager-5dc8897f6c-8dhfx\" (UID: \"215a61d8-f0e1-419d-b4cb-8ddc801d5a79\") " pod="openshift-route-controller-manager/route-controller-manager-5dc8897f6c-8dhfx" Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.236395 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/215a61d8-f0e1-419d-b4cb-8ddc801d5a79-serving-cert\") pod \"route-controller-manager-5dc8897f6c-8dhfx\" (UID: \"215a61d8-f0e1-419d-b4cb-8ddc801d5a79\") " pod="openshift-route-controller-manager/route-controller-manager-5dc8897f6c-8dhfx" Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.257921 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vfkd4" podStartSLOduration=160.257895762 podStartE2EDuration="2m40.257895762s" podCreationTimestamp="2026-03-20 10:55:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:29.257363877 +0000 UTC m=+233.478724785" watchObservedRunningTime="2026-03-20 10:58:29.257895762 +0000 UTC m=+233.479256660" Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.278164 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7npd\" (UniqueName: \"kubernetes.io/projected/215a61d8-f0e1-419d-b4cb-8ddc801d5a79-kube-api-access-n7npd\") pod \"route-controller-manager-5dc8897f6c-8dhfx\" (UID: \"215a61d8-f0e1-419d-b4cb-8ddc801d5a79\") " pod="openshift-route-controller-manager/route-controller-manager-5dc8897f6c-8dhfx" Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.314426 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-d6wf9" podStartSLOduration=161.314393583 podStartE2EDuration="2m41.314393583s" podCreationTimestamp="2026-03-20 10:55:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:29.307581754 +0000 UTC m=+233.528942652" watchObservedRunningTime="2026-03-20 10:58:29.314393583 +0000 UTC m=+233.535754501" Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.327123 4860 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:29 crc kubenswrapper[4860]: E0320 10:58:29.327935 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:29.827910709 +0000 UTC m=+234.049271607 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.359396 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-zhgh4" podStartSLOduration=160.359364904 podStartE2EDuration="2m40.359364904s" podCreationTimestamp="2026-03-20 10:55:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:29.339168252 +0000 UTC m=+233.560529160" watchObservedRunningTime="2026-03-20 10:58:29.359364904 +0000 UTC m=+233.580725802" Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.396612 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5dc8897f6c-8dhfx" Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.431490 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:29 crc kubenswrapper[4860]: E0320 10:58:29.431997 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:29.931971394 +0000 UTC m=+234.153332292 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.432096 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f15060fa-5a28-4a12-be7b-2823e921eb90" path="/var/lib/kubelet/pods/f15060fa-5a28-4a12-be7b-2823e921eb90/volumes" Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.533405 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:29 crc kubenswrapper[4860]: E0320 10:58:29.533928 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:30.033909579 +0000 UTC m=+234.255270477 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.601656 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-9ffd4b47b-9qh65" event={"ID":"2af6ac3e-b4b2-400b-bfd2-0142f07dabd0","Type":"ContainerStarted","Data":"9e7a764e0c672086dc2232d2c01074598903d934f8d665fc7b2f1e55bf12ba31"} Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.610909 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-2k58g" event={"ID":"dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383","Type":"ContainerStarted","Data":"3ebdddedcf41b1b5a3ca26aed44754d8031631ac1336491746cb50ab65e583f6"} Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.622900 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-vpp2k" event={"ID":"a8f2eaf6-3749-4695-8df1-5972598c8ac6","Type":"ContainerStarted","Data":"e9f15176d1fd4501f85ecbe88e065250d07fa7387e88e17ed25ec8434312c211"} Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.628591 4860 patch_prober.go:28] interesting 
pod/marketplace-operator-79b997595-zhgh4 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body=
Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.628671 4860 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-zhgh4" podUID="403ca5f6-bd52-40de-88d6-5151b3202c76" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused"
Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.629963 4860 patch_prober.go:28] interesting pod/downloads-7954f5f757-45vfv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body=
Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.630032 4860 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-45vfv" podUID="825c6b77-c03a-463c-b9a4-d26a1ac398f0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused"
Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.630265 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9ffz6"
Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.648689 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 10:58:29 crc kubenswrapper[4860]: E0320 10:58:29.648824 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:30.148800844 +0000 UTC m=+234.370161742 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.649088 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm"
Mar 20 10:58:29 crc kubenswrapper[4860]: E0320 10:58:29.649543 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:30.149533744 +0000 UTC m=+234.370894642 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.702880 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9ffz6"
Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.743309 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-vpp2k" podStartSLOduration=161.743283052 podStartE2EDuration="2m41.743283052s" podCreationTimestamp="2026-03-20 10:55:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:29.701657624 +0000 UTC m=+233.923018522" watchObservedRunningTime="2026-03-20 10:58:29.743283052 +0000 UTC m=+233.964643950"
Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.751617 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 10:58:29 crc kubenswrapper[4860]: E0320 10:58:29.753573 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:30.253517197 +0000 UTC m=+234.474878085 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.866853 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm"
Mar 20 10:58:29 crc kubenswrapper[4860]: E0320 10:58:29.867303 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:30.367288951 +0000 UTC m=+234.588649849 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.976994 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 10:58:29 crc kubenswrapper[4860]: E0320 10:58:29.977647 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:30.477625681 +0000 UTC m=+234.698986579 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.031261 4860 patch_prober.go:28] interesting pod/router-default-5444994796-5gdgj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 10:58:30 crc kubenswrapper[4860]: [-]has-synced failed: reason withheld
Mar 20 10:58:30 crc kubenswrapper[4860]: [+]process-running ok
Mar 20 10:58:30 crc kubenswrapper[4860]: healthz check failed
Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.031357 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5gdgj" podUID="d582dc3e-7510-42be-aa3a-1d15b35c327c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.085366 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm"
Mar 20 10:58:30 crc kubenswrapper[4860]: E0320 10:58:30.085790 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:30.585773079 +0000 UTC m=+234.807133977 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.136777 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-n79b7"]
Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.138005 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n79b7"
Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.142849 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.171639 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vfkd4"
Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.177773 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n79b7"]
Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.186553 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 10:58:30 crc kubenswrapper[4860]: E0320 10:58:30.187050 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:30.687003755 +0000 UTC m=+234.908364663 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.187602 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5dc8897f6c-8dhfx"]
Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.290165 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2690d8b-c7f7-4e71-af44-33444e4d6187-catalog-content\") pod \"community-operators-n79b7\" (UID: \"d2690d8b-c7f7-4e71-af44-33444e4d6187\") " pod="openshift-marketplace/community-operators-n79b7"
Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.290248 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fks5\" (UniqueName: \"kubernetes.io/projected/d2690d8b-c7f7-4e71-af44-33444e4d6187-kube-api-access-6fks5\") pod \"community-operators-n79b7\" (UID: \"d2690d8b-c7f7-4e71-af44-33444e4d6187\") " pod="openshift-marketplace/community-operators-n79b7"
Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.290285 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2690d8b-c7f7-4e71-af44-33444e4d6187-utilities\") pod \"community-operators-n79b7\" (UID: \"d2690d8b-c7f7-4e71-af44-33444e4d6187\") " pod="openshift-marketplace/community-operators-n79b7"
Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.290328 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm"
Mar 20 10:58:30 crc kubenswrapper[4860]: E0320 10:58:30.290690 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:30.790675369 +0000 UTC m=+235.012036267 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.346563 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-w5w95"]
Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.347574 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w5w95"
Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.392218 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.392509 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2690d8b-c7f7-4e71-af44-33444e4d6187-catalog-content\") pod \"community-operators-n79b7\" (UID: \"d2690d8b-c7f7-4e71-af44-33444e4d6187\") " pod="openshift-marketplace/community-operators-n79b7"
Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.392576 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fks5\" (UniqueName: \"kubernetes.io/projected/d2690d8b-c7f7-4e71-af44-33444e4d6187-kube-api-access-6fks5\") pod \"community-operators-n79b7\" (UID: \"d2690d8b-c7f7-4e71-af44-33444e4d6187\") " pod="openshift-marketplace/community-operators-n79b7"
Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.392626 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2690d8b-c7f7-4e71-af44-33444e4d6187-utilities\") pod \"community-operators-n79b7\" (UID: \"d2690d8b-c7f7-4e71-af44-33444e4d6187\") " pod="openshift-marketplace/community-operators-n79b7"
Mar 20 10:58:30 crc kubenswrapper[4860]: E0320 10:58:30.392721 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:30.892677976 +0000 UTC m=+235.114038874 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.392930 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm"
Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.393127 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2690d8b-c7f7-4e71-af44-33444e4d6187-utilities\") pod \"community-operators-n79b7\" (UID: \"d2690d8b-c7f7-4e71-af44-33444e4d6187\") " pod="openshift-marketplace/community-operators-n79b7"
Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.393560 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2690d8b-c7f7-4e71-af44-33444e4d6187-catalog-content\") pod \"community-operators-n79b7\" (UID: \"d2690d8b-c7f7-4e71-af44-33444e4d6187\") " pod="openshift-marketplace/community-operators-n79b7"
Mar 20 10:58:30 crc kubenswrapper[4860]: E0320 10:58:30.393989 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:30.893963382 +0000 UTC m=+235.115324440 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.494652 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 10:58:30 crc kubenswrapper[4860]: E0320 10:58:30.495063 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:30.995019063 +0000 UTC m=+235.216379961 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.495152 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f84f111-5991-4e78-9508-82283b8e36f7-catalog-content\") pod \"certified-operators-w5w95\" (UID: \"4f84f111-5991-4e78-9508-82283b8e36f7\") " pod="openshift-marketplace/certified-operators-w5w95"
Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.495282 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f84f111-5991-4e78-9508-82283b8e36f7-utilities\") pod \"certified-operators-w5w95\" (UID: \"4f84f111-5991-4e78-9508-82283b8e36f7\") " pod="openshift-marketplace/certified-operators-w5w95"
Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.495403 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm"
Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.495509 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4h2z2\" (UniqueName: \"kubernetes.io/projected/4f84f111-5991-4e78-9508-82283b8e36f7-kube-api-access-4h2z2\") pod \"certified-operators-w5w95\" (UID: \"4f84f111-5991-4e78-9508-82283b8e36f7\") " pod="openshift-marketplace/certified-operators-w5w95"
Mar 20 10:58:30 crc kubenswrapper[4860]: E0320 10:58:30.495859 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:30.995853066 +0000 UTC m=+235.217213964 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.597037 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 10:58:30 crc kubenswrapper[4860]: E0320 10:58:30.597296 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:31.097252776 +0000 UTC m=+235.318613674 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.597373 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f84f111-5991-4e78-9508-82283b8e36f7-utilities\") pod \"certified-operators-w5w95\" (UID: \"4f84f111-5991-4e78-9508-82283b8e36f7\") " pod="openshift-marketplace/certified-operators-w5w95"
Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.597456 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm"
Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.597504 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4h2z2\" (UniqueName: \"kubernetes.io/projected/4f84f111-5991-4e78-9508-82283b8e36f7-kube-api-access-4h2z2\") pod \"certified-operators-w5w95\" (UID: \"4f84f111-5991-4e78-9508-82283b8e36f7\") " pod="openshift-marketplace/certified-operators-w5w95"
Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.597557 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f84f111-5991-4e78-9508-82283b8e36f7-catalog-content\") pod \"certified-operators-w5w95\" (UID: \"4f84f111-5991-4e78-9508-82283b8e36f7\") " pod="openshift-marketplace/certified-operators-w5w95"
Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.598615 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f84f111-5991-4e78-9508-82283b8e36f7-catalog-content\") pod \"certified-operators-w5w95\" (UID: \"4f84f111-5991-4e78-9508-82283b8e36f7\") " pod="openshift-marketplace/certified-operators-w5w95"
Mar 20 10:58:30 crc kubenswrapper[4860]: E0320 10:58:30.598938 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:31.098922353 +0000 UTC m=+235.320283251 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.599160 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f84f111-5991-4e78-9508-82283b8e36f7-utilities\") pod \"certified-operators-w5w95\" (UID: \"4f84f111-5991-4e78-9508-82283b8e36f7\") " pod="openshift-marketplace/certified-operators-w5w95"
Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.624792 4860 ???:1] "http: TLS handshake error from 192.168.126.11:42882: no serving certificate available for the kubelet"
Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.649303 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5dc8897f6c-8dhfx" event={"ID":"215a61d8-f0e1-419d-b4cb-8ddc801d5a79","Type":"ContainerStarted","Data":"07cb2faf1a3563eaa92fb2a9f50a19ad2493c7dab6c573aa3b095afac43feb29"}
Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.649380 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5dc8897f6c-8dhfx" event={"ID":"215a61d8-f0e1-419d-b4cb-8ddc801d5a79","Type":"ContainerStarted","Data":"7aa219357098d2e5cc353f906dff76cf1a673bc6396fbbfabef96091000e9adc"}
Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.650874 4860 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-zhgh4 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body=
Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.651315 4860 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-zhgh4" podUID="403ca5f6-bd52-40de-88d6-5151b3202c76" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused"
Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.699325 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 10:58:30 crc kubenswrapper[4860]: E0320 10:58:30.699716 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:31.199698996 +0000 UTC m=+235.421059894 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.745120 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w5w95"]
Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.752044 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fks5\" (UniqueName: \"kubernetes.io/projected/d2690d8b-c7f7-4e71-af44-33444e4d6187-kube-api-access-6fks5\") pod \"community-operators-n79b7\" (UID: \"d2690d8b-c7f7-4e71-af44-33444e4d6187\") " pod="openshift-marketplace/community-operators-n79b7"
Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.752170 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4h2z2\" (UniqueName: \"kubernetes.io/projected/4f84f111-5991-4e78-9508-82283b8e36f7-kube-api-access-4h2z2\") pod \"certified-operators-w5w95\" (UID: \"4f84f111-5991-4e78-9508-82283b8e36f7\") " pod="openshift-marketplace/certified-operators-w5w95"
Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.755533 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.762797 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n79b7"
Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.763332 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-r7ckk"]
Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.764560 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r7ckk"
Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.767887 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r7ckk"]
Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.801923 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm"
Mar 20 10:58:30 crc kubenswrapper[4860]: E0320 10:58:30.802383 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:31.302364622 +0000 UTC m=+235.523725520 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.823328 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-p6sh9"]
Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.824461 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p6sh9"
Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.854876 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p6sh9"]
Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.914897 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.915161 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0e14a08-824b-450f-bf98-2a476da0d44b-catalog-content\") pod \"community-operators-r7ckk\" (UID: \"f0e14a08-824b-450f-bf98-2a476da0d44b\") " pod="openshift-marketplace/community-operators-r7ckk"
Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.915209 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77rcr\" (UniqueName: \"kubernetes.io/projected/7b622f82-e01c-42b8-8061-16b6e8f551fb-kube-api-access-77rcr\") pod \"certified-operators-p6sh9\" (UID: \"7b622f82-e01c-42b8-8061-16b6e8f551fb\") " pod="openshift-marketplace/certified-operators-p6sh9"
Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.915258 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0e14a08-824b-450f-bf98-2a476da0d44b-utilities\") pod \"community-operators-r7ckk\" (UID: \"f0e14a08-824b-450f-bf98-2a476da0d44b\") " pod="openshift-marketplace/community-operators-r7ckk"
Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.915284 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b622f82-e01c-42b8-8061-16b6e8f551fb-utilities\") pod \"certified-operators-p6sh9\" (UID: \"7b622f82-e01c-42b8-8061-16b6e8f551fb\") " pod="openshift-marketplace/certified-operators-p6sh9"
Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.915302 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b622f82-e01c-42b8-8061-16b6e8f551fb-catalog-content\") pod \"certified-operators-p6sh9\" (UID: \"7b622f82-e01c-42b8-8061-16b6e8f551fb\") " pod="openshift-marketplace/certified-operators-p6sh9"
Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.915366 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhz7s\" (UniqueName: \"kubernetes.io/projected/f0e14a08-824b-450f-bf98-2a476da0d44b-kube-api-access-jhz7s\") pod \"community-operators-r7ckk\" (UID: \"f0e14a08-824b-450f-bf98-2a476da0d44b\") " pod="openshift-marketplace/community-operators-r7ckk"
Mar 20 10:58:30 crc kubenswrapper[4860]: E0320 10:58:30.915506 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:31.415484658 +0000 UTC m=+235.636845566 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.934457 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5dc8897f6c-8dhfx" podStartSLOduration=5.934433086 podStartE2EDuration="5.934433086s" podCreationTimestamp="2026-03-20 10:58:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:30.901439068 +0000 UTC m=+235.122799966" watchObservedRunningTime="2026-03-20 10:58:30.934433086 +0000 UTC m=+235.155793984"
Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.982127 4860 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/certified-operators-w5w95" Mar 20 10:58:31 crc kubenswrapper[4860]: I0320 10:58:31.017414 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhz7s\" (UniqueName: \"kubernetes.io/projected/f0e14a08-824b-450f-bf98-2a476da0d44b-kube-api-access-jhz7s\") pod \"community-operators-r7ckk\" (UID: \"f0e14a08-824b-450f-bf98-2a476da0d44b\") " pod="openshift-marketplace/community-operators-r7ckk" Mar 20 10:58:31 crc kubenswrapper[4860]: I0320 10:58:31.017502 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:31 crc kubenswrapper[4860]: I0320 10:58:31.017550 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0e14a08-824b-450f-bf98-2a476da0d44b-catalog-content\") pod \"community-operators-r7ckk\" (UID: \"f0e14a08-824b-450f-bf98-2a476da0d44b\") " pod="openshift-marketplace/community-operators-r7ckk" Mar 20 10:58:31 crc kubenswrapper[4860]: I0320 10:58:31.017586 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77rcr\" (UniqueName: \"kubernetes.io/projected/7b622f82-e01c-42b8-8061-16b6e8f551fb-kube-api-access-77rcr\") pod \"certified-operators-p6sh9\" (UID: \"7b622f82-e01c-42b8-8061-16b6e8f551fb\") " pod="openshift-marketplace/certified-operators-p6sh9" Mar 20 10:58:31 crc kubenswrapper[4860]: I0320 10:58:31.017635 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0e14a08-824b-450f-bf98-2a476da0d44b-utilities\") pod 
\"community-operators-r7ckk\" (UID: \"f0e14a08-824b-450f-bf98-2a476da0d44b\") " pod="openshift-marketplace/community-operators-r7ckk" Mar 20 10:58:31 crc kubenswrapper[4860]: I0320 10:58:31.017658 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b622f82-e01c-42b8-8061-16b6e8f551fb-utilities\") pod \"certified-operators-p6sh9\" (UID: \"7b622f82-e01c-42b8-8061-16b6e8f551fb\") " pod="openshift-marketplace/certified-operators-p6sh9" Mar 20 10:58:31 crc kubenswrapper[4860]: I0320 10:58:31.017676 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b622f82-e01c-42b8-8061-16b6e8f551fb-catalog-content\") pod \"certified-operators-p6sh9\" (UID: \"7b622f82-e01c-42b8-8061-16b6e8f551fb\") " pod="openshift-marketplace/certified-operators-p6sh9" Mar 20 10:58:31 crc kubenswrapper[4860]: I0320 10:58:31.021948 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0e14a08-824b-450f-bf98-2a476da0d44b-catalog-content\") pod \"community-operators-r7ckk\" (UID: \"f0e14a08-824b-450f-bf98-2a476da0d44b\") " pod="openshift-marketplace/community-operators-r7ckk" Mar 20 10:58:31 crc kubenswrapper[4860]: I0320 10:58:31.021943 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0e14a08-824b-450f-bf98-2a476da0d44b-utilities\") pod \"community-operators-r7ckk\" (UID: \"f0e14a08-824b-450f-bf98-2a476da0d44b\") " pod="openshift-marketplace/community-operators-r7ckk" Mar 20 10:58:31 crc kubenswrapper[4860]: E0320 10:58:31.022764 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-20 10:58:31.522745982 +0000 UTC m=+235.744106880 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:31 crc kubenswrapper[4860]: I0320 10:58:31.032883 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b622f82-e01c-42b8-8061-16b6e8f551fb-utilities\") pod \"certified-operators-p6sh9\" (UID: \"7b622f82-e01c-42b8-8061-16b6e8f551fb\") " pod="openshift-marketplace/certified-operators-p6sh9" Mar 20 10:58:31 crc kubenswrapper[4860]: I0320 10:58:31.033548 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b622f82-e01c-42b8-8061-16b6e8f551fb-catalog-content\") pod \"certified-operators-p6sh9\" (UID: \"7b622f82-e01c-42b8-8061-16b6e8f551fb\") " pod="openshift-marketplace/certified-operators-p6sh9" Mar 20 10:58:31 crc kubenswrapper[4860]: I0320 10:58:31.039652 4860 patch_prober.go:28] interesting pod/router-default-5444994796-5gdgj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 10:58:31 crc kubenswrapper[4860]: [-]has-synced failed: reason withheld Mar 20 10:58:31 crc kubenswrapper[4860]: [+]process-running ok Mar 20 10:58:31 crc kubenswrapper[4860]: healthz check failed Mar 20 10:58:31 crc kubenswrapper[4860]: I0320 10:58:31.039727 4860 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-5gdgj" podUID="d582dc3e-7510-42be-aa3a-1d15b35c327c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 10:58:31 crc kubenswrapper[4860]: I0320 10:58:31.055332 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77rcr\" (UniqueName: \"kubernetes.io/projected/7b622f82-e01c-42b8-8061-16b6e8f551fb-kube-api-access-77rcr\") pod \"certified-operators-p6sh9\" (UID: \"7b622f82-e01c-42b8-8061-16b6e8f551fb\") " pod="openshift-marketplace/certified-operators-p6sh9" Mar 20 10:58:31 crc kubenswrapper[4860]: I0320 10:58:31.066334 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhz7s\" (UniqueName: \"kubernetes.io/projected/f0e14a08-824b-450f-bf98-2a476da0d44b-kube-api-access-jhz7s\") pod \"community-operators-r7ckk\" (UID: \"f0e14a08-824b-450f-bf98-2a476da0d44b\") " pod="openshift-marketplace/community-operators-r7ckk" Mar 20 10:58:31 crc kubenswrapper[4860]: I0320 10:58:31.118981 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:31 crc kubenswrapper[4860]: E0320 10:58:31.119862 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:31.619844183 +0000 UTC m=+235.841205081 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:31 crc kubenswrapper[4860]: I0320 10:58:31.162069 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r7ckk" Mar 20 10:58:31 crc kubenswrapper[4860]: I0320 10:58:31.179701 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p6sh9" Mar 20 10:58:31 crc kubenswrapper[4860]: I0320 10:58:31.222121 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:31 crc kubenswrapper[4860]: E0320 10:58:31.222566 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:31.72254957 +0000 UTC m=+235.943910468 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:31 crc kubenswrapper[4860]: I0320 10:58:31.322934 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:31 crc kubenswrapper[4860]: E0320 10:58:31.323352 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:31.823334523 +0000 UTC m=+236.044695421 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:31 crc kubenswrapper[4860]: I0320 10:58:31.426757 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:31 crc kubenswrapper[4860]: E0320 10:58:31.427122 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:31.92710795 +0000 UTC m=+236.148468848 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:31 crc kubenswrapper[4860]: I0320 10:58:31.540650 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:31 crc kubenswrapper[4860]: E0320 10:58:31.540857 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:32.040819033 +0000 UTC m=+236.262179931 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:31 crc kubenswrapper[4860]: I0320 10:58:31.541804 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:31 crc kubenswrapper[4860]: E0320 10:58:31.542328 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:32.042314105 +0000 UTC m=+236.263675003 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:31 crc kubenswrapper[4860]: I0320 10:58:31.645055 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:31 crc kubenswrapper[4860]: E0320 10:58:31.645579 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:32.145538946 +0000 UTC m=+236.366899844 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:31 crc kubenswrapper[4860]: I0320 10:58:31.645695 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:31 crc kubenswrapper[4860]: E0320 10:58:31.646273 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:32.146265386 +0000 UTC m=+236.367626284 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:31 crc kubenswrapper[4860]: I0320 10:58:31.713965 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w5w95"] Mar 20 10:58:31 crc kubenswrapper[4860]: I0320 10:58:31.716302 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-2k58g" event={"ID":"dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383","Type":"ContainerStarted","Data":"ef24e9f8d4b6a43cc833f7f0d60bbe74efda0e84841e788eaabadc9fd54fdd4f"} Mar 20 10:58:31 crc kubenswrapper[4860]: I0320 10:58:31.716601 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5dc8897f6c-8dhfx" Mar 20 10:58:31 crc kubenswrapper[4860]: I0320 10:58:31.750123 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:31 crc kubenswrapper[4860]: E0320 10:58:31.750763 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:32.250733922 +0000 UTC m=+236.472094820 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:31 crc kubenswrapper[4860]: I0320 10:58:31.764915 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n79b7"] Mar 20 10:58:31 crc kubenswrapper[4860]: I0320 10:58:31.823673 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r7ckk"] Mar 20 10:58:31 crc kubenswrapper[4860]: W0320 10:58:31.826678 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2690d8b_c7f7_4e71_af44_33444e4d6187.slice/crio-d3e5a4f45fcca5a9f1ea6868d200bf518732a2225ce154b3a8c6e2fe9edbe0fc WatchSource:0}: Error finding container d3e5a4f45fcca5a9f1ea6868d200bf518732a2225ce154b3a8c6e2fe9edbe0fc: Status 404 returned error can't find the container with id d3e5a4f45fcca5a9f1ea6868d200bf518732a2225ce154b3a8c6e2fe9edbe0fc Mar 20 10:58:31 crc kubenswrapper[4860]: I0320 10:58:31.854027 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:31 crc kubenswrapper[4860]: E0320 10:58:31.855148 4860 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:32.355130236 +0000 UTC m=+236.576491134 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:31 crc kubenswrapper[4860]: I0320 10:58:31.902638 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5dc8897f6c-8dhfx" Mar 20 10:58:31 crc kubenswrapper[4860]: I0320 10:58:31.955495 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:31 crc kubenswrapper[4860]: E0320 10:58:31.955915 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:32.455890419 +0000 UTC m=+236.677251317 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:31 crc kubenswrapper[4860]: I0320 10:58:31.965536 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-twkfs" Mar 20 10:58:31 crc kubenswrapper[4860]: I0320 10:58:31.996158 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-twkfs" Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.011865 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p6sh9"] Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.023305 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-5gdgj" Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.030434 4860 patch_prober.go:28] interesting pod/router-default-5444994796-5gdgj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 10:58:32 crc kubenswrapper[4860]: [-]has-synced failed: reason withheld Mar 20 10:58:32 crc kubenswrapper[4860]: [+]process-running ok Mar 20 10:58:32 crc kubenswrapper[4860]: healthz check failed Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.030485 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5gdgj" podUID="d582dc3e-7510-42be-aa3a-1d15b35c327c" containerName="router" probeResult="failure" output="HTTP 
probe failed with statuscode: 500" Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.071029 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:32 crc kubenswrapper[4860]: E0320 10:58:32.071468 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:32.571445843 +0000 UTC m=+236.792806741 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.120656 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-d9xlp"] Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.121824 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d9xlp"
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.130831 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.143428 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d9xlp"]
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.174038 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 10:58:32 crc kubenswrapper[4860]: E0320 10:58:32.174363 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:32.674329975 +0000 UTC m=+236.895690873 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.174417 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f20cb95e-5480-4c9c-859f-0b03d679ab06-catalog-content\") pod \"redhat-marketplace-d9xlp\" (UID: \"f20cb95e-5480-4c9c-859f-0b03d679ab06\") " pod="openshift-marketplace/redhat-marketplace-d9xlp"
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.174461 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f20cb95e-5480-4c9c-859f-0b03d679ab06-utilities\") pod \"redhat-marketplace-d9xlp\" (UID: \"f20cb95e-5480-4c9c-859f-0b03d679ab06\") " pod="openshift-marketplace/redhat-marketplace-d9xlp"
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.174515 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kg46p\" (UniqueName: \"kubernetes.io/projected/f20cb95e-5480-4c9c-859f-0b03d679ab06-kube-api-access-kg46p\") pod \"redhat-marketplace-d9xlp\" (UID: \"f20cb95e-5480-4c9c-859f-0b03d679ab06\") " pod="openshift-marketplace/redhat-marketplace-d9xlp"
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.174632 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm"
Mar 20 10:58:32 crc kubenswrapper[4860]: E0320 10:58:32.175050 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:32.675033575 +0000 UTC m=+236.896394473 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.277900 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.283156 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f20cb95e-5480-4c9c-859f-0b03d679ab06-catalog-content\") pod \"redhat-marketplace-d9xlp\" (UID: \"f20cb95e-5480-4c9c-859f-0b03d679ab06\") " pod="openshift-marketplace/redhat-marketplace-d9xlp"
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.283336 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f20cb95e-5480-4c9c-859f-0b03d679ab06-utilities\") pod \"redhat-marketplace-d9xlp\" (UID: \"f20cb95e-5480-4c9c-859f-0b03d679ab06\") " pod="openshift-marketplace/redhat-marketplace-d9xlp"
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.283451 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kg46p\" (UniqueName: \"kubernetes.io/projected/f20cb95e-5480-4c9c-859f-0b03d679ab06-kube-api-access-kg46p\") pod \"redhat-marketplace-d9xlp\" (UID: \"f20cb95e-5480-4c9c-859f-0b03d679ab06\") " pod="openshift-marketplace/redhat-marketplace-d9xlp"
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.285307 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f20cb95e-5480-4c9c-859f-0b03d679ab06-catalog-content\") pod \"redhat-marketplace-d9xlp\" (UID: \"f20cb95e-5480-4c9c-859f-0b03d679ab06\") " pod="openshift-marketplace/redhat-marketplace-d9xlp"
Mar 20 10:58:32 crc kubenswrapper[4860]: E0320 10:58:32.285496 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:32.785457036 +0000 UTC m=+237.006817934 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.286110 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f20cb95e-5480-4c9c-859f-0b03d679ab06-utilities\") pod \"redhat-marketplace-d9xlp\" (UID: \"f20cb95e-5480-4c9c-859f-0b03d679ab06\") " pod="openshift-marketplace/redhat-marketplace-d9xlp"
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.286664 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-vpp2k"
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.286710 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-vpp2k"
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.356076 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kg46p\" (UniqueName: \"kubernetes.io/projected/f20cb95e-5480-4c9c-859f-0b03d679ab06-kube-api-access-kg46p\") pod \"redhat-marketplace-d9xlp\" (UID: \"f20cb95e-5480-4c9c-859f-0b03d679ab06\") " pod="openshift-marketplace/redhat-marketplace-d9xlp"
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.366915 4860 patch_prober.go:28] interesting pod/apiserver-76f77b778f-vpp2k container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Mar 20 10:58:32 crc kubenswrapper[4860]: [+]log ok
Mar 20 10:58:32 crc kubenswrapper[4860]: [+]etcd ok
Mar 20 10:58:32 crc kubenswrapper[4860]: [+]poststarthook/start-apiserver-admission-initializer ok
Mar 20 10:58:32 crc kubenswrapper[4860]: [+]poststarthook/generic-apiserver-start-informers ok
Mar 20 10:58:32 crc kubenswrapper[4860]: [+]poststarthook/max-in-flight-filter ok
Mar 20 10:58:32 crc kubenswrapper[4860]: [+]poststarthook/storage-object-count-tracker-hook ok
Mar 20 10:58:32 crc kubenswrapper[4860]: [+]poststarthook/image.openshift.io-apiserver-caches ok
Mar 20 10:58:32 crc kubenswrapper[4860]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld
Mar 20 10:58:32 crc kubenswrapper[4860]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld
Mar 20 10:58:32 crc kubenswrapper[4860]: [+]poststarthook/project.openshift.io-projectcache ok
Mar 20 10:58:32 crc kubenswrapper[4860]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok
Mar 20 10:58:32 crc kubenswrapper[4860]: [+]poststarthook/openshift.io-startinformers ok
Mar 20 10:58:32 crc kubenswrapper[4860]: [+]poststarthook/openshift.io-restmapperupdater ok
Mar 20 10:58:32 crc kubenswrapper[4860]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Mar 20 10:58:32 crc kubenswrapper[4860]: livez check failed
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.367006 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-vpp2k" podUID="a8f2eaf6-3749-4695-8df1-5972598c8ac6" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.367468 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-sqrz5"
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.368915 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-sqrz5"
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.385258 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm"
Mar 20 10:58:32 crc kubenswrapper[4860]: E0320 10:58:32.386122 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:32.886099866 +0000 UTC m=+237.107460764 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.388015 4860 patch_prober.go:28] interesting pod/console-f9d7485db-sqrz5 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body=
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.388064 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-sqrz5" podUID="e8ca532e-b0d7-494c-886f-bff0c8009707" containerName="console" probeResult="failure" output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused"
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.388530 4860 patch_prober.go:28] interesting pod/downloads-7954f5f757-45vfv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body=
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.388553 4860 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-45vfv" podUID="825c6b77-c03a-463c-b9a4-d26a1ac398f0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused"
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.388649 4860 patch_prober.go:28] interesting pod/downloads-7954f5f757-45vfv container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body=
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.388729 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-45vfv" podUID="825c6b77-c03a-463c-b9a4-d26a1ac398f0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused"
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.460244 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d9xlp"
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.477903 4860 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock"
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.490021 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 10:58:32 crc kubenswrapper[4860]: E0320 10:58:32.490575 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:32.990549161 +0000 UTC m=+237.211910069 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.550323 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5jpww"]
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.551668 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5jpww"
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.585240 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5jpww"]
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.596335 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm"
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.596392 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rln5j\" (UniqueName: \"kubernetes.io/projected/2268b7ae-c1db-4ef4-8236-60f7cfa277a1-kube-api-access-rln5j\") pod \"redhat-marketplace-5jpww\" (UID: \"2268b7ae-c1db-4ef4-8236-60f7cfa277a1\") " pod="openshift-marketplace/redhat-marketplace-5jpww"
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.596437 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2268b7ae-c1db-4ef4-8236-60f7cfa277a1-utilities\") pod \"redhat-marketplace-5jpww\" (UID: \"2268b7ae-c1db-4ef4-8236-60f7cfa277a1\") " pod="openshift-marketplace/redhat-marketplace-5jpww"
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.596513 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2268b7ae-c1db-4ef4-8236-60f7cfa277a1-catalog-content\") pod \"redhat-marketplace-5jpww\" (UID: \"2268b7ae-c1db-4ef4-8236-60f7cfa277a1\") " pod="openshift-marketplace/redhat-marketplace-5jpww"
Mar 20 10:58:32 crc kubenswrapper[4860]: E0320 10:58:32.596873 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:33.096860188 +0000 UTC m=+237.318221086 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.698459 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.698729 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2268b7ae-c1db-4ef4-8236-60f7cfa277a1-catalog-content\") pod \"redhat-marketplace-5jpww\" (UID: \"2268b7ae-c1db-4ef4-8236-60f7cfa277a1\") " pod="openshift-marketplace/redhat-marketplace-5jpww"
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.698781 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rln5j\" (UniqueName: \"kubernetes.io/projected/2268b7ae-c1db-4ef4-8236-60f7cfa277a1-kube-api-access-rln5j\") pod \"redhat-marketplace-5jpww\" (UID: \"2268b7ae-c1db-4ef4-8236-60f7cfa277a1\") " pod="openshift-marketplace/redhat-marketplace-5jpww"
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.698818 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2268b7ae-c1db-4ef4-8236-60f7cfa277a1-utilities\") pod \"redhat-marketplace-5jpww\" (UID: \"2268b7ae-c1db-4ef4-8236-60f7cfa277a1\") " pod="openshift-marketplace/redhat-marketplace-5jpww"
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.699382 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2268b7ae-c1db-4ef4-8236-60f7cfa277a1-utilities\") pod \"redhat-marketplace-5jpww\" (UID: \"2268b7ae-c1db-4ef4-8236-60f7cfa277a1\") " pod="openshift-marketplace/redhat-marketplace-5jpww"
Mar 20 10:58:32 crc kubenswrapper[4860]: E0320 10:58:32.699753 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:33.19973344 +0000 UTC m=+237.421094338 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.699764 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2268b7ae-c1db-4ef4-8236-60f7cfa277a1-catalog-content\") pod \"redhat-marketplace-5jpww\" (UID: \"2268b7ae-c1db-4ef4-8236-60f7cfa277a1\") " pod="openshift-marketplace/redhat-marketplace-5jpww"
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.720610 4860 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-20T10:58:32.47793414Z","Handler":null,"Name":""}
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.731988 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rln5j\" (UniqueName: \"kubernetes.io/projected/2268b7ae-c1db-4ef4-8236-60f7cfa277a1-kube-api-access-rln5j\") pod \"redhat-marketplace-5jpww\" (UID: \"2268b7ae-c1db-4ef4-8236-60f7cfa277a1\") " pod="openshift-marketplace/redhat-marketplace-5jpww"
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.733293 4860 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.733342 4860 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.760045 4860 generic.go:334] "Generic (PLEG): container finished" podID="d2690d8b-c7f7-4e71-af44-33444e4d6187" containerID="fc706b2c173f49d2a44bb4c1c738033acd0d9e47df9115abcd3736cf3695dc3d" exitCode=0
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.760199 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n79b7" event={"ID":"d2690d8b-c7f7-4e71-af44-33444e4d6187","Type":"ContainerDied","Data":"fc706b2c173f49d2a44bb4c1c738033acd0d9e47df9115abcd3736cf3695dc3d"}
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.760293 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n79b7" event={"ID":"d2690d8b-c7f7-4e71-af44-33444e4d6187","Type":"ContainerStarted","Data":"d3e5a4f45fcca5a9f1ea6868d200bf518732a2225ce154b3a8c6e2fe9edbe0fc"}
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.781344 4860 generic.go:334] "Generic (PLEG): container finished" podID="7b622f82-e01c-42b8-8061-16b6e8f551fb" containerID="e24cde6154df13c72246e903afddf246ae7bde629e68f46946db4ded716f4fbf" exitCode=0
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.781754 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p6sh9" event={"ID":"7b622f82-e01c-42b8-8061-16b6e8f551fb","Type":"ContainerDied","Data":"e24cde6154df13c72246e903afddf246ae7bde629e68f46946db4ded716f4fbf"}
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.781848 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p6sh9" event={"ID":"7b622f82-e01c-42b8-8061-16b6e8f551fb","Type":"ContainerStarted","Data":"a0b846fa7e38edf968a2ecba37cb073cb9550be5be8e0d139e448911ef2ef8dd"}
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.805176 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm"
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.826213 4860 generic.go:334] "Generic (PLEG): container finished" podID="437c32d4-4b5f-4657-86d6-5214e3bfc01f" containerID="509a0ab6073b8f241ed054d972f10c10904777731b271c4522d9caaf55b66c8c" exitCode=0
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.826364 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-d6wf9" event={"ID":"437c32d4-4b5f-4657-86d6-5214e3bfc01f","Type":"ContainerDied","Data":"509a0ab6073b8f241ed054d972f10c10904777731b271c4522d9caaf55b66c8c"}
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.836369 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-zhgh4"
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.848984 4860 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.849045 4860 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm"
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.866215 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5jpww"
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.921715 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-2k58g" event={"ID":"dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383","Type":"ContainerStarted","Data":"7115075fbd3ae4e5afa0a20202db6ef747ec3fe98f0eb9b25e8cd4d053503781"}
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.938738 4860 generic.go:334] "Generic (PLEG): container finished" podID="4f84f111-5991-4e78-9508-82283b8e36f7" containerID="6a6c388a79209365a4c51728af06e871b437cb5dd7b79151114287e046dc0dfe" exitCode=0
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.938840 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w5w95" event={"ID":"4f84f111-5991-4e78-9508-82283b8e36f7","Type":"ContainerDied","Data":"6a6c388a79209365a4c51728af06e871b437cb5dd7b79151114287e046dc0dfe"}
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.938875 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w5w95" event={"ID":"4f84f111-5991-4e78-9508-82283b8e36f7","Type":"ContainerStarted","Data":"1f538c1360593e9a410b70b066b34c33f5665e2dac735a2212ce3b3dbdf2dce0"}
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.967217 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm"
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.967828 4860 generic.go:334] "Generic (PLEG): container finished" podID="f0e14a08-824b-450f-bf98-2a476da0d44b" containerID="19dec2e9725dfca97383712d1fced11c792963a7799dee6ec4f9f37020ff60b2" exitCode=0
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.973776 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r7ckk" event={"ID":"f0e14a08-824b-450f-bf98-2a476da0d44b","Type":"ContainerDied","Data":"19dec2e9725dfca97383712d1fced11c792963a7799dee6ec4f9f37020ff60b2"}
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.973974 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r7ckk" event={"ID":"f0e14a08-824b-450f-bf98-2a476da0d44b","Type":"ContainerStarted","Data":"711ef831caa70569060ba2dc068e9cede6a21ca93c6a666bf7abd4f4e2156736"}
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.011708 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.033838 4860 patch_prober.go:28] interesting pod/router-default-5444994796-5gdgj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 10:58:33 crc kubenswrapper[4860]: [-]has-synced failed: reason withheld
Mar 20 10:58:33 crc kubenswrapper[4860]: [+]process-running ok
Mar 20 10:58:33 crc kubenswrapper[4860]: healthz check failed
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.033921 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5gdgj" podUID="d582dc3e-7510-42be-aa3a-1d15b35c327c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.042115 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d9xlp"]
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.062023 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.205574 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm"
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.251424 4860 ???:1] "http: TLS handshake error from 192.168.126.11:42898: no serving certificate available for the kubelet"
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.314601 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qq8bh"]
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.317529 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qq8bh"
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.320914 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.381020 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qq8bh"]
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.421110 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/514f05c3-1404-46c6-9f4d-68437ea8ee0b-catalog-content\") pod \"redhat-operators-qq8bh\" (UID: \"514f05c3-1404-46c6-9f4d-68437ea8ee0b\") " pod="openshift-marketplace/redhat-operators-qq8bh"
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.421171 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/514f05c3-1404-46c6-9f4d-68437ea8ee0b-utilities\") pod \"redhat-operators-qq8bh\" (UID: \"514f05c3-1404-46c6-9f4d-68437ea8ee0b\") " pod="openshift-marketplace/redhat-operators-qq8bh"
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.421207 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdzkg\" (UniqueName: \"kubernetes.io/projected/514f05c3-1404-46c6-9f4d-68437ea8ee0b-kube-api-access-pdzkg\") pod \"redhat-operators-qq8bh\" (UID: \"514f05c3-1404-46c6-9f4d-68437ea8ee0b\") " pod="openshift-marketplace/redhat-operators-qq8bh"
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.444078 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes"
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.459484 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5jpww"]
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.523089 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/514f05c3-1404-46c6-9f4d-68437ea8ee0b-utilities\") pod \"redhat-operators-qq8bh\" (UID: \"514f05c3-1404-46c6-9f4d-68437ea8ee0b\") " pod="openshift-marketplace/redhat-operators-qq8bh"
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.523330 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdzkg\" (UniqueName: \"kubernetes.io/projected/514f05c3-1404-46c6-9f4d-68437ea8ee0b-kube-api-access-pdzkg\") pod \"redhat-operators-qq8bh\" (UID: \"514f05c3-1404-46c6-9f4d-68437ea8ee0b\") " pod="openshift-marketplace/redhat-operators-qq8bh"
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.523600 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/514f05c3-1404-46c6-9f4d-68437ea8ee0b-catalog-content\") pod \"redhat-operators-qq8bh\" (UID: \"514f05c3-1404-46c6-9f4d-68437ea8ee0b\") " pod="openshift-marketplace/redhat-operators-qq8bh"
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.525624 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/514f05c3-1404-46c6-9f4d-68437ea8ee0b-utilities\") pod \"redhat-operators-qq8bh\" (UID: \"514f05c3-1404-46c6-9f4d-68437ea8ee0b\") " pod="openshift-marketplace/redhat-operators-qq8bh"
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.525833 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/514f05c3-1404-46c6-9f4d-68437ea8ee0b-catalog-content\") pod \"redhat-operators-qq8bh\" (UID: \"514f05c3-1404-46c6-9f4d-68437ea8ee0b\") " pod="openshift-marketplace/redhat-operators-qq8bh"
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.552489 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdzkg\" (UniqueName: \"kubernetes.io/projected/514f05c3-1404-46c6-9f4d-68437ea8ee0b-kube-api-access-pdzkg\") pod \"redhat-operators-qq8bh\" (UID: \"514f05c3-1404-46c6-9f4d-68437ea8ee0b\") " pod="openshift-marketplace/redhat-operators-qq8bh"
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.600667 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.601655 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.605311 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt"
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.606136 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n"
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.611212 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.625562 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c7f16bf2-db43-4057-9961-ef03202f7828-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c7f16bf2-db43-4057-9961-ef03202f7828\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.625631 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c7f16bf2-db43-4057-9961-ef03202f7828-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c7f16bf2-db43-4057-9961-ef03202f7828\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.660065 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qq8bh"
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.712443 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jx27x"]
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.716773 4860 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/redhat-operators-jx27x" Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.731121 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jx27x"] Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.733797 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c7f16bf2-db43-4057-9961-ef03202f7828-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c7f16bf2-db43-4057-9961-ef03202f7828\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.733893 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c7f16bf2-db43-4057-9961-ef03202f7828-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c7f16bf2-db43-4057-9961-ef03202f7828\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.733940 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mn9d7\" (UniqueName: \"kubernetes.io/projected/f81a43aa-2c39-4d49-8526-f097322dd7bf-kube-api-access-mn9d7\") pod \"redhat-operators-jx27x\" (UID: \"f81a43aa-2c39-4d49-8526-f097322dd7bf\") " pod="openshift-marketplace/redhat-operators-jx27x" Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.734064 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f81a43aa-2c39-4d49-8526-f097322dd7bf-utilities\") pod \"redhat-operators-jx27x\" (UID: \"f81a43aa-2c39-4d49-8526-f097322dd7bf\") " pod="openshift-marketplace/redhat-operators-jx27x" Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.734134 4860 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f81a43aa-2c39-4d49-8526-f097322dd7bf-catalog-content\") pod \"redhat-operators-jx27x\" (UID: \"f81a43aa-2c39-4d49-8526-f097322dd7bf\") " pod="openshift-marketplace/redhat-operators-jx27x" Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.734274 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c7f16bf2-db43-4057-9961-ef03202f7828-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c7f16bf2-db43-4057-9961-ef03202f7828\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.755076 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c7f16bf2-db43-4057-9961-ef03202f7828-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c7f16bf2-db43-4057-9961-ef03202f7828\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.794200 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-8dbgm"] Mar 20 10:58:33 crc kubenswrapper[4860]: W0320 10:58:33.801778 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39b41087_226b_4f73_9fc4_64616b430f2d.slice/crio-e1dc651025a62d4a8c6173da89e1be13654b3ccc8586f46471cbc5846256700b WatchSource:0}: Error finding container e1dc651025a62d4a8c6173da89e1be13654b3ccc8586f46471cbc5846256700b: Status 404 returned error can't find the container with id e1dc651025a62d4a8c6173da89e1be13654b3ccc8586f46471cbc5846256700b Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.835979 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/f81a43aa-2c39-4d49-8526-f097322dd7bf-catalog-content\") pod \"redhat-operators-jx27x\" (UID: \"f81a43aa-2c39-4d49-8526-f097322dd7bf\") " pod="openshift-marketplace/redhat-operators-jx27x" Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.836132 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mn9d7\" (UniqueName: \"kubernetes.io/projected/f81a43aa-2c39-4d49-8526-f097322dd7bf-kube-api-access-mn9d7\") pod \"redhat-operators-jx27x\" (UID: \"f81a43aa-2c39-4d49-8526-f097322dd7bf\") " pod="openshift-marketplace/redhat-operators-jx27x" Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.836243 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f81a43aa-2c39-4d49-8526-f097322dd7bf-utilities\") pod \"redhat-operators-jx27x\" (UID: \"f81a43aa-2c39-4d49-8526-f097322dd7bf\") " pod="openshift-marketplace/redhat-operators-jx27x" Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.837102 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f81a43aa-2c39-4d49-8526-f097322dd7bf-catalog-content\") pod \"redhat-operators-jx27x\" (UID: \"f81a43aa-2c39-4d49-8526-f097322dd7bf\") " pod="openshift-marketplace/redhat-operators-jx27x" Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.837444 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f81a43aa-2c39-4d49-8526-f097322dd7bf-utilities\") pod \"redhat-operators-jx27x\" (UID: \"f81a43aa-2c39-4d49-8526-f097322dd7bf\") " pod="openshift-marketplace/redhat-operators-jx27x" Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.865173 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mn9d7\" (UniqueName: 
\"kubernetes.io/projected/f81a43aa-2c39-4d49-8526-f097322dd7bf-kube-api-access-mn9d7\") pod \"redhat-operators-jx27x\" (UID: \"f81a43aa-2c39-4d49-8526-f097322dd7bf\") " pod="openshift-marketplace/redhat-operators-jx27x" Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.946441 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.975357 4860 generic.go:334] "Generic (PLEG): container finished" podID="f20cb95e-5480-4c9c-859f-0b03d679ab06" containerID="3c2e183391f543c0b20ce684b31a9a8169bae71ceb6b6c5028bb8d294a9daeac" exitCode=0 Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.975551 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d9xlp" event={"ID":"f20cb95e-5480-4c9c-859f-0b03d679ab06","Type":"ContainerDied","Data":"3c2e183391f543c0b20ce684b31a9a8169bae71ceb6b6c5028bb8d294a9daeac"} Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.977265 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d9xlp" event={"ID":"f20cb95e-5480-4c9c-859f-0b03d679ab06","Type":"ContainerStarted","Data":"39591daa264ba7bebe5fdc529015addd733110c1c54ed6b98d8a162a754e8d60"} Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.984088 4860 generic.go:334] "Generic (PLEG): container finished" podID="2268b7ae-c1db-4ef4-8236-60f7cfa277a1" containerID="5ed610d57137030afeaeb124289fb2f5072934d814423d8d1fd76ae4e4bbd772" exitCode=0 Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.984272 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5jpww" event={"ID":"2268b7ae-c1db-4ef4-8236-60f7cfa277a1","Type":"ContainerDied","Data":"5ed610d57137030afeaeb124289fb2f5072934d814423d8d1fd76ae4e4bbd772"} Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.984329 4860 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-5jpww" event={"ID":"2268b7ae-c1db-4ef4-8236-60f7cfa277a1","Type":"ContainerStarted","Data":"5f7c8a0760c1233f7a673fb0037c3446fe619e19acbc1953810d2c42b3db815b"} Mar 20 10:58:34 crc kubenswrapper[4860]: I0320 10:58:34.013189 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 20 10:58:34 crc kubenswrapper[4860]: I0320 10:58:34.016803 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 10:58:34 crc kubenswrapper[4860]: I0320 10:58:34.019887 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 20 10:58:34 crc kubenswrapper[4860]: I0320 10:58:34.019990 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 20 10:58:34 crc kubenswrapper[4860]: I0320 10:58:34.022940 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qq8bh"] Mar 20 10:58:34 crc kubenswrapper[4860]: I0320 10:58:34.022993 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-2k58g" event={"ID":"dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383","Type":"ContainerStarted","Data":"6ae55a6ee512bab12f957da5d1b95423a1a9d0446f4cc36c58cf532984c034ba"} Mar 20 10:58:34 crc kubenswrapper[4860]: I0320 10:58:34.024759 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" event={"ID":"39b41087-226b-4f73-9fc4-64616b430f2d","Type":"ContainerStarted","Data":"e1dc651025a62d4a8c6173da89e1be13654b3ccc8586f46471cbc5846256700b"} Mar 20 10:58:34 crc kubenswrapper[4860]: I0320 10:58:34.027293 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 20 10:58:34 crc kubenswrapper[4860]: I0320 
10:58:34.032391 4860 patch_prober.go:28] interesting pod/router-default-5444994796-5gdgj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 10:58:34 crc kubenswrapper[4860]: [-]has-synced failed: reason withheld Mar 20 10:58:34 crc kubenswrapper[4860]: [+]process-running ok Mar 20 10:58:34 crc kubenswrapper[4860]: healthz check failed Mar 20 10:58:34 crc kubenswrapper[4860]: I0320 10:58:34.032480 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5gdgj" podUID="d582dc3e-7510-42be-aa3a-1d15b35c327c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 10:58:34 crc kubenswrapper[4860]: I0320 10:58:34.042083 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ee04bfe1-80c1-43ea-9c2f-a8dde5f81388-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"ee04bfe1-80c1-43ea-9c2f-a8dde5f81388\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 10:58:34 crc kubenswrapper[4860]: I0320 10:58:34.042750 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ee04bfe1-80c1-43ea-9c2f-a8dde5f81388-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"ee04bfe1-80c1-43ea-9c2f-a8dde5f81388\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 10:58:34 crc kubenswrapper[4860]: I0320 10:58:34.078660 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jx27x" Mar 20 10:58:34 crc kubenswrapper[4860]: I0320 10:58:34.116077 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-2k58g" podStartSLOduration=15.116046746 podStartE2EDuration="15.116046746s" podCreationTimestamp="2026-03-20 10:58:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:34.108181307 +0000 UTC m=+238.329542215" watchObservedRunningTime="2026-03-20 10:58:34.116046746 +0000 UTC m=+238.337407644" Mar 20 10:58:34 crc kubenswrapper[4860]: I0320 10:58:34.150063 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ee04bfe1-80c1-43ea-9c2f-a8dde5f81388-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"ee04bfe1-80c1-43ea-9c2f-a8dde5f81388\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 10:58:34 crc kubenswrapper[4860]: I0320 10:58:34.150365 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ee04bfe1-80c1-43ea-9c2f-a8dde5f81388-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"ee04bfe1-80c1-43ea-9c2f-a8dde5f81388\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 10:58:34 crc kubenswrapper[4860]: I0320 10:58:34.151820 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ee04bfe1-80c1-43ea-9c2f-a8dde5f81388-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"ee04bfe1-80c1-43ea-9c2f-a8dde5f81388\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 10:58:34 crc kubenswrapper[4860]: I0320 10:58:34.185492 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/ee04bfe1-80c1-43ea-9c2f-a8dde5f81388-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"ee04bfe1-80c1-43ea-9c2f-a8dde5f81388\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 10:58:34 crc kubenswrapper[4860]: I0320 10:58:34.264891 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 20 10:58:34 crc kubenswrapper[4860]: I0320 10:58:34.349330 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 10:58:34 crc kubenswrapper[4860]: I0320 10:58:34.465978 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-d6wf9" Mar 20 10:58:34 crc kubenswrapper[4860]: I0320 10:58:34.557026 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpk7d\" (UniqueName: \"kubernetes.io/projected/437c32d4-4b5f-4657-86d6-5214e3bfc01f-kube-api-access-wpk7d\") pod \"437c32d4-4b5f-4657-86d6-5214e3bfc01f\" (UID: \"437c32d4-4b5f-4657-86d6-5214e3bfc01f\") " Mar 20 10:58:34 crc kubenswrapper[4860]: I0320 10:58:34.557140 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/437c32d4-4b5f-4657-86d6-5214e3bfc01f-config-volume\") pod \"437c32d4-4b5f-4657-86d6-5214e3bfc01f\" (UID: \"437c32d4-4b5f-4657-86d6-5214e3bfc01f\") " Mar 20 10:58:34 crc kubenswrapper[4860]: I0320 10:58:34.557308 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/437c32d4-4b5f-4657-86d6-5214e3bfc01f-secret-volume\") pod \"437c32d4-4b5f-4657-86d6-5214e3bfc01f\" (UID: \"437c32d4-4b5f-4657-86d6-5214e3bfc01f\") " Mar 20 10:58:34 crc kubenswrapper[4860]: I0320 10:58:34.558472 4860 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/configmap/437c32d4-4b5f-4657-86d6-5214e3bfc01f-config-volume" (OuterVolumeSpecName: "config-volume") pod "437c32d4-4b5f-4657-86d6-5214e3bfc01f" (UID: "437c32d4-4b5f-4657-86d6-5214e3bfc01f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:58:34 crc kubenswrapper[4860]: I0320 10:58:34.561294 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jx27x"] Mar 20 10:58:34 crc kubenswrapper[4860]: I0320 10:58:34.565438 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/437c32d4-4b5f-4657-86d6-5214e3bfc01f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "437c32d4-4b5f-4657-86d6-5214e3bfc01f" (UID: "437c32d4-4b5f-4657-86d6-5214e3bfc01f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:58:34 crc kubenswrapper[4860]: I0320 10:58:34.565626 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/437c32d4-4b5f-4657-86d6-5214e3bfc01f-kube-api-access-wpk7d" (OuterVolumeSpecName: "kube-api-access-wpk7d") pod "437c32d4-4b5f-4657-86d6-5214e3bfc01f" (UID: "437c32d4-4b5f-4657-86d6-5214e3bfc01f"). InnerVolumeSpecName "kube-api-access-wpk7d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:58:34 crc kubenswrapper[4860]: W0320 10:58:34.630870 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf81a43aa_2c39_4d49_8526_f097322dd7bf.slice/crio-b681f56bdfe5d7107d46e455eb32b402c90f63f71364660357f8ecd488fda604 WatchSource:0}: Error finding container b681f56bdfe5d7107d46e455eb32b402c90f63f71364660357f8ecd488fda604: Status 404 returned error can't find the container with id b681f56bdfe5d7107d46e455eb32b402c90f63f71364660357f8ecd488fda604 Mar 20 10:58:34 crc kubenswrapper[4860]: I0320 10:58:34.660046 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wpk7d\" (UniqueName: \"kubernetes.io/projected/437c32d4-4b5f-4657-86d6-5214e3bfc01f-kube-api-access-wpk7d\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:34 crc kubenswrapper[4860]: I0320 10:58:34.660160 4860 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/437c32d4-4b5f-4657-86d6-5214e3bfc01f-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:34 crc kubenswrapper[4860]: I0320 10:58:34.660175 4860 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/437c32d4-4b5f-4657-86d6-5214e3bfc01f-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:34 crc kubenswrapper[4860]: I0320 10:58:34.707899 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 20 10:58:34 crc kubenswrapper[4860]: W0320 10:58:34.835535 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podee04bfe1_80c1_43ea_9c2f_a8dde5f81388.slice/crio-da7b08ed5546a0ff8c7a39bfbda760782f55c7c1cfa7a2816badce41f2f14d8f WatchSource:0}: Error finding container da7b08ed5546a0ff8c7a39bfbda760782f55c7c1cfa7a2816badce41f2f14d8f: Status 404 returned error can't find the 
container with id da7b08ed5546a0ff8c7a39bfbda760782f55c7c1cfa7a2816badce41f2f14d8f Mar 20 10:58:35 crc kubenswrapper[4860]: I0320 10:58:35.026642 4860 patch_prober.go:28] interesting pod/router-default-5444994796-5gdgj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 10:58:35 crc kubenswrapper[4860]: [-]has-synced failed: reason withheld Mar 20 10:58:35 crc kubenswrapper[4860]: [+]process-running ok Mar 20 10:58:35 crc kubenswrapper[4860]: healthz check failed Mar 20 10:58:35 crc kubenswrapper[4860]: I0320 10:58:35.026700 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5gdgj" podUID="d582dc3e-7510-42be-aa3a-1d15b35c327c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 10:58:35 crc kubenswrapper[4860]: I0320 10:58:35.045125 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" event={"ID":"39b41087-226b-4f73-9fc4-64616b430f2d","Type":"ContainerStarted","Data":"f3f20d10721886ec4a9101220e9e657b2d89cba3d5a38f7e81c1ff64702925d8"} Mar 20 10:58:35 crc kubenswrapper[4860]: I0320 10:58:35.047578 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:35 crc kubenswrapper[4860]: I0320 10:58:35.049030 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"c7f16bf2-db43-4057-9961-ef03202f7828","Type":"ContainerStarted","Data":"72fea729710975c82c871f9d3fed185003fd3dc264665ec3bf550bc85ad152a0"} Mar 20 10:58:35 crc kubenswrapper[4860]: I0320 10:58:35.049077 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" 
event={"ID":"c7f16bf2-db43-4057-9961-ef03202f7828","Type":"ContainerStarted","Data":"b4581e8a8e22d21944cf78b64c955a6fce94adca574c93dd2c425e73eb876cb3"} Mar 20 10:58:35 crc kubenswrapper[4860]: I0320 10:58:35.057667 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"ee04bfe1-80c1-43ea-9c2f-a8dde5f81388","Type":"ContainerStarted","Data":"da7b08ed5546a0ff8c7a39bfbda760782f55c7c1cfa7a2816badce41f2f14d8f"} Mar 20 10:58:35 crc kubenswrapper[4860]: I0320 10:58:35.061149 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-d6wf9" Mar 20 10:58:35 crc kubenswrapper[4860]: I0320 10:58:35.061149 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-d6wf9" event={"ID":"437c32d4-4b5f-4657-86d6-5214e3bfc01f","Type":"ContainerDied","Data":"07d327bc1bb178b575c3169b4eaad76591b2a789fd3236207f1f2278827c3306"} Mar 20 10:58:35 crc kubenswrapper[4860]: I0320 10:58:35.061250 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="07d327bc1bb178b575c3169b4eaad76591b2a789fd3236207f1f2278827c3306" Mar 20 10:58:35 crc kubenswrapper[4860]: I0320 10:58:35.064262 4860 generic.go:334] "Generic (PLEG): container finished" podID="514f05c3-1404-46c6-9f4d-68437ea8ee0b" containerID="62152cf84060d8945786af63e9ccf7c263d87bdc6a8315bb08a5eacc7087c847" exitCode=0 Mar 20 10:58:35 crc kubenswrapper[4860]: I0320 10:58:35.064358 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qq8bh" event={"ID":"514f05c3-1404-46c6-9f4d-68437ea8ee0b","Type":"ContainerDied","Data":"62152cf84060d8945786af63e9ccf7c263d87bdc6a8315bb08a5eacc7087c847"} Mar 20 10:58:35 crc kubenswrapper[4860]: I0320 10:58:35.064404 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qq8bh" 
event={"ID":"514f05c3-1404-46c6-9f4d-68437ea8ee0b","Type":"ContainerStarted","Data":"db09468d977aabd81ce312da99eaa8c50b25e5282affd310a612fbfda038e94c"} Mar 20 10:58:35 crc kubenswrapper[4860]: I0320 10:58:35.077397 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jx27x" event={"ID":"f81a43aa-2c39-4d49-8526-f097322dd7bf","Type":"ContainerStarted","Data":"cc28a9c4b1f826fc06b2b83281cd0a01bf1dc28b3e9617ab722a34ea90577dc6"} Mar 20 10:58:35 crc kubenswrapper[4860]: I0320 10:58:35.077446 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jx27x" event={"ID":"f81a43aa-2c39-4d49-8526-f097322dd7bf","Type":"ContainerStarted","Data":"b681f56bdfe5d7107d46e455eb32b402c90f63f71364660357f8ecd488fda604"} Mar 20 10:58:35 crc kubenswrapper[4860]: I0320 10:58:35.079698 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" podStartSLOduration=167.079670921 podStartE2EDuration="2m47.079670921s" podCreationTimestamp="2026-03-20 10:55:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:35.072619655 +0000 UTC m=+239.293980553" watchObservedRunningTime="2026-03-20 10:58:35.079670921 +0000 UTC m=+239.301031819" Mar 20 10:58:35 crc kubenswrapper[4860]: I0320 10:58:35.122446 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.122415 podStartE2EDuration="2.122415s" podCreationTimestamp="2026-03-20 10:58:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:35.121031871 +0000 UTC m=+239.342392789" watchObservedRunningTime="2026-03-20 10:58:35.122415 +0000 UTC m=+239.343775898" Mar 20 10:58:36 crc 
kubenswrapper[4860]: I0320 10:58:36.025608 4860 patch_prober.go:28] interesting pod/router-default-5444994796-5gdgj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 10:58:36 crc kubenswrapper[4860]: [-]has-synced failed: reason withheld
Mar 20 10:58:36 crc kubenswrapper[4860]: [+]process-running ok
Mar 20 10:58:36 crc kubenswrapper[4860]: healthz check failed
Mar 20 10:58:36 crc kubenswrapper[4860]: I0320 10:58:36.025699 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5gdgj" podUID="d582dc3e-7510-42be-aa3a-1d15b35c327c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 10:58:36 crc kubenswrapper[4860]: I0320 10:58:36.103917 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"ee04bfe1-80c1-43ea-9c2f-a8dde5f81388","Type":"ContainerStarted","Data":"58095badac555e1dbfbe198d782954bf2ce2eba6cbdea6f1e244cd2ee4bb9ae8"}
Mar 20 10:58:36 crc kubenswrapper[4860]: I0320 10:58:36.108526 4860 generic.go:334] "Generic (PLEG): container finished" podID="f81a43aa-2c39-4d49-8526-f097322dd7bf" containerID="cc28a9c4b1f826fc06b2b83281cd0a01bf1dc28b3e9617ab722a34ea90577dc6" exitCode=0
Mar 20 10:58:36 crc kubenswrapper[4860]: I0320 10:58:36.108595 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jx27x" event={"ID":"f81a43aa-2c39-4d49-8526-f097322dd7bf","Type":"ContainerDied","Data":"cc28a9c4b1f826fc06b2b83281cd0a01bf1dc28b3e9617ab722a34ea90577dc6"}
Mar 20 10:58:36 crc kubenswrapper[4860]: I0320 10:58:36.124121 4860 generic.go:334] "Generic (PLEG): container finished" podID="c7f16bf2-db43-4057-9961-ef03202f7828" containerID="72fea729710975c82c871f9d3fed185003fd3dc264665ec3bf550bc85ad152a0" exitCode=0
Mar 20 10:58:36 crc kubenswrapper[4860]: I0320 10:58:36.124736 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"c7f16bf2-db43-4057-9961-ef03202f7828","Type":"ContainerDied","Data":"72fea729710975c82c871f9d3fed185003fd3dc264665ec3bf550bc85ad152a0"}
Mar 20 10:58:36 crc kubenswrapper[4860]: I0320 10:58:36.173592 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=3.1735413980000002 podStartE2EDuration="3.173541398s" podCreationTimestamp="2026-03-20 10:58:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:36.135320615 +0000 UTC m=+240.356681523" watchObservedRunningTime="2026-03-20 10:58:36.173541398 +0000 UTC m=+240.394902296"
Mar 20 10:58:37 crc kubenswrapper[4860]: I0320 10:58:37.025378 4860 patch_prober.go:28] interesting pod/router-default-5444994796-5gdgj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 10:58:37 crc kubenswrapper[4860]: [-]has-synced failed: reason withheld
Mar 20 10:58:37 crc kubenswrapper[4860]: [+]process-running ok
Mar 20 10:58:37 crc kubenswrapper[4860]: healthz check failed
Mar 20 10:58:37 crc kubenswrapper[4860]: I0320 10:58:37.025459 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5gdgj" podUID="d582dc3e-7510-42be-aa3a-1d15b35c327c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 10:58:37 crc kubenswrapper[4860]: I0320 10:58:37.153739 4860 generic.go:334] "Generic (PLEG): container finished" podID="ee04bfe1-80c1-43ea-9c2f-a8dde5f81388" containerID="58095badac555e1dbfbe198d782954bf2ce2eba6cbdea6f1e244cd2ee4bb9ae8" exitCode=0
Mar 20 10:58:37 crc kubenswrapper[4860]: I0320 10:58:37.156119 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"ee04bfe1-80c1-43ea-9c2f-a8dde5f81388","Type":"ContainerDied","Data":"58095badac555e1dbfbe198d782954bf2ce2eba6cbdea6f1e244cd2ee4bb9ae8"}
Mar 20 10:58:37 crc kubenswrapper[4860]: I0320 10:58:37.288883 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-vpp2k"
Mar 20 10:58:37 crc kubenswrapper[4860]: I0320 10:58:37.299945 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-vpp2k"
Mar 20 10:58:37 crc kubenswrapper[4860]: I0320 10:58:37.703411 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 20 10:58:37 crc kubenswrapper[4860]: I0320 10:58:37.866094 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c7f16bf2-db43-4057-9961-ef03202f7828-kube-api-access\") pod \"c7f16bf2-db43-4057-9961-ef03202f7828\" (UID: \"c7f16bf2-db43-4057-9961-ef03202f7828\") "
Mar 20 10:58:37 crc kubenswrapper[4860]: I0320 10:58:37.866175 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c7f16bf2-db43-4057-9961-ef03202f7828-kubelet-dir\") pod \"c7f16bf2-db43-4057-9961-ef03202f7828\" (UID: \"c7f16bf2-db43-4057-9961-ef03202f7828\") "
Mar 20 10:58:37 crc kubenswrapper[4860]: I0320 10:58:37.866724 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c7f16bf2-db43-4057-9961-ef03202f7828-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "c7f16bf2-db43-4057-9961-ef03202f7828" (UID: "c7f16bf2-db43-4057-9961-ef03202f7828"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 10:58:37 crc kubenswrapper[4860]: I0320 10:58:37.884572 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7f16bf2-db43-4057-9961-ef03202f7828-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "c7f16bf2-db43-4057-9961-ef03202f7828" (UID: "c7f16bf2-db43-4057-9961-ef03202f7828"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 10:58:37 crc kubenswrapper[4860]: I0320 10:58:37.968243 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c7f16bf2-db43-4057-9961-ef03202f7828-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 20 10:58:37 crc kubenswrapper[4860]: I0320 10:58:37.968784 4860 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c7f16bf2-db43-4057-9961-ef03202f7828-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 20 10:58:38 crc kubenswrapper[4860]: I0320 10:58:38.030744 4860 patch_prober.go:28] interesting pod/router-default-5444994796-5gdgj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 10:58:38 crc kubenswrapper[4860]: [-]has-synced failed: reason withheld
Mar 20 10:58:38 crc kubenswrapper[4860]: [+]process-running ok
Mar 20 10:58:38 crc kubenswrapper[4860]: healthz check failed
Mar 20 10:58:38 crc kubenswrapper[4860]: I0320 10:58:38.030824 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5gdgj" podUID="d582dc3e-7510-42be-aa3a-1d15b35c327c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 10:58:38 crc kubenswrapper[4860]: I0320 10:58:38.184231 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 20 10:58:38 crc kubenswrapper[4860]: I0320 10:58:38.184256 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"c7f16bf2-db43-4057-9961-ef03202f7828","Type":"ContainerDied","Data":"b4581e8a8e22d21944cf78b64c955a6fce94adca574c93dd2c425e73eb876cb3"}
Mar 20 10:58:38 crc kubenswrapper[4860]: I0320 10:58:38.184335 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b4581e8a8e22d21944cf78b64c955a6fce94adca574c93dd2c425e73eb876cb3"
Mar 20 10:58:38 crc kubenswrapper[4860]: I0320 10:58:38.190295 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-l2xf5"
Mar 20 10:58:38 crc kubenswrapper[4860]: I0320 10:58:38.415319 4860 ???:1] "http: TLS handshake error from 192.168.126.11:42910: no serving certificate available for the kubelet"
Mar 20 10:58:38 crc kubenswrapper[4860]: I0320 10:58:38.517095 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 20 10:58:38 crc kubenswrapper[4860]: I0320 10:58:38.692302 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ee04bfe1-80c1-43ea-9c2f-a8dde5f81388-kubelet-dir\") pod \"ee04bfe1-80c1-43ea-9c2f-a8dde5f81388\" (UID: \"ee04bfe1-80c1-43ea-9c2f-a8dde5f81388\") "
Mar 20 10:58:38 crc kubenswrapper[4860]: I0320 10:58:38.692436 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ee04bfe1-80c1-43ea-9c2f-a8dde5f81388-kube-api-access\") pod \"ee04bfe1-80c1-43ea-9c2f-a8dde5f81388\" (UID: \"ee04bfe1-80c1-43ea-9c2f-a8dde5f81388\") "
Mar 20 10:58:38 crc kubenswrapper[4860]: I0320 10:58:38.692448 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ee04bfe1-80c1-43ea-9c2f-a8dde5f81388-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "ee04bfe1-80c1-43ea-9c2f-a8dde5f81388" (UID: "ee04bfe1-80c1-43ea-9c2f-a8dde5f81388"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 10:58:38 crc kubenswrapper[4860]: I0320 10:58:38.692900 4860 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ee04bfe1-80c1-43ea-9c2f-a8dde5f81388-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 20 10:58:38 crc kubenswrapper[4860]: I0320 10:58:38.699519 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee04bfe1-80c1-43ea-9c2f-a8dde5f81388-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "ee04bfe1-80c1-43ea-9c2f-a8dde5f81388" (UID: "ee04bfe1-80c1-43ea-9c2f-a8dde5f81388"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 10:58:38 crc kubenswrapper[4860]: I0320 10:58:38.793977 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ee04bfe1-80c1-43ea-9c2f-a8dde5f81388-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 20 10:58:39 crc kubenswrapper[4860]: I0320 10:58:39.025411 4860 patch_prober.go:28] interesting pod/router-default-5444994796-5gdgj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 10:58:39 crc kubenswrapper[4860]: [-]has-synced failed: reason withheld
Mar 20 10:58:39 crc kubenswrapper[4860]: [+]process-running ok
Mar 20 10:58:39 crc kubenswrapper[4860]: healthz check failed
Mar 20 10:58:39 crc kubenswrapper[4860]: I0320 10:58:39.025510 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5gdgj" podUID="d582dc3e-7510-42be-aa3a-1d15b35c327c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 10:58:39 crc kubenswrapper[4860]: I0320 10:58:39.227611 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"ee04bfe1-80c1-43ea-9c2f-a8dde5f81388","Type":"ContainerDied","Data":"da7b08ed5546a0ff8c7a39bfbda760782f55c7c1cfa7a2816badce41f2f14d8f"}
Mar 20 10:58:39 crc kubenswrapper[4860]: I0320 10:58:39.227669 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da7b08ed5546a0ff8c7a39bfbda760782f55c7c1cfa7a2816badce41f2f14d8f"
Mar 20 10:58:39 crc kubenswrapper[4860]: I0320 10:58:39.227728 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 20 10:58:40 crc kubenswrapper[4860]: I0320 10:58:40.023537 4860 patch_prober.go:28] interesting pod/router-default-5444994796-5gdgj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 10:58:40 crc kubenswrapper[4860]: [-]has-synced failed: reason withheld
Mar 20 10:58:40 crc kubenswrapper[4860]: [+]process-running ok
Mar 20 10:58:40 crc kubenswrapper[4860]: healthz check failed
Mar 20 10:58:40 crc kubenswrapper[4860]: I0320 10:58:40.023611 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5gdgj" podUID="d582dc3e-7510-42be-aa3a-1d15b35c327c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 10:58:40 crc kubenswrapper[4860]: I0320 10:58:40.421018 4860 ???:1] "http: TLS handshake error from 192.168.126.11:38984: no serving certificate available for the kubelet"
Mar 20 10:58:41 crc kubenswrapper[4860]: I0320 10:58:41.026531 4860 patch_prober.go:28] interesting pod/router-default-5444994796-5gdgj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 10:58:41 crc kubenswrapper[4860]: [-]has-synced failed: reason withheld
Mar 20 10:58:41 crc kubenswrapper[4860]: [+]process-running ok
Mar 20 10:58:41 crc kubenswrapper[4860]: healthz check failed
Mar 20 10:58:41 crc kubenswrapper[4860]: I0320 10:58:41.026608 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5gdgj" podUID="d582dc3e-7510-42be-aa3a-1d15b35c327c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 10:58:42 crc kubenswrapper[4860]: I0320 10:58:42.023770 4860 patch_prober.go:28] interesting pod/router-default-5444994796-5gdgj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 10:58:42 crc kubenswrapper[4860]: [-]has-synced failed: reason withheld
Mar 20 10:58:42 crc kubenswrapper[4860]: [+]process-running ok
Mar 20 10:58:42 crc kubenswrapper[4860]: healthz check failed
Mar 20 10:58:42 crc kubenswrapper[4860]: I0320 10:58:42.024104 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5gdgj" podUID="d582dc3e-7510-42be-aa3a-1d15b35c327c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 10:58:42 crc kubenswrapper[4860]: I0320 10:58:42.363033 4860 patch_prober.go:28] interesting pod/console-f9d7485db-sqrz5 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body=
Mar 20 10:58:42 crc kubenswrapper[4860]: I0320 10:58:42.363102 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-sqrz5" podUID="e8ca532e-b0d7-494c-886f-bff0c8009707" containerName="console" probeResult="failure" output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused"
Mar 20 10:58:42 crc kubenswrapper[4860]: I0320 10:58:42.390908 4860 patch_prober.go:28] interesting pod/downloads-7954f5f757-45vfv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body=
Mar 20 10:58:42 crc kubenswrapper[4860]: I0320 10:58:42.390953 4860 patch_prober.go:28] interesting pod/downloads-7954f5f757-45vfv container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body=
Mar 20 10:58:42 crc kubenswrapper[4860]: I0320 10:58:42.390978 4860 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-45vfv" podUID="825c6b77-c03a-463c-b9a4-d26a1ac398f0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused"
Mar 20 10:58:42 crc kubenswrapper[4860]: I0320 10:58:42.391009 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-45vfv" podUID="825c6b77-c03a-463c-b9a4-d26a1ac398f0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused"
Mar 20 10:58:43 crc kubenswrapper[4860]: I0320 10:58:43.024275 4860 patch_prober.go:28] interesting pod/router-default-5444994796-5gdgj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 10:58:43 crc kubenswrapper[4860]: [-]has-synced failed: reason withheld
Mar 20 10:58:43 crc kubenswrapper[4860]: [+]process-running ok
Mar 20 10:58:43 crc kubenswrapper[4860]: healthz check failed
Mar 20 10:58:43 crc kubenswrapper[4860]: I0320 10:58:43.024404 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5gdgj" podUID="d582dc3e-7510-42be-aa3a-1d15b35c327c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 10:58:43 crc kubenswrapper[4860]: I0320 10:58:43.183645 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/035f0b3d-92ee-4564-8dad-28b231e1c800-metrics-certs\") pod \"network-metrics-daemon-q85gq\" (UID: \"035f0b3d-92ee-4564-8dad-28b231e1c800\") " pod="openshift-multus/network-metrics-daemon-q85gq"
Mar 20 10:58:43 crc kubenswrapper[4860]: I0320 10:58:43.185363 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 20 10:58:43 crc kubenswrapper[4860]: I0320 10:58:43.204929 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/035f0b3d-92ee-4564-8dad-28b231e1c800-metrics-certs\") pod \"network-metrics-daemon-q85gq\" (UID: \"035f0b3d-92ee-4564-8dad-28b231e1c800\") " pod="openshift-multus/network-metrics-daemon-q85gq"
Mar 20 10:58:43 crc kubenswrapper[4860]: I0320 10:58:43.439208 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Mar 20 10:58:43 crc kubenswrapper[4860]: I0320 10:58:43.446270 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq"
Mar 20 10:58:44 crc kubenswrapper[4860]: I0320 10:58:44.024567 4860 patch_prober.go:28] interesting pod/router-default-5444994796-5gdgj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 10:58:44 crc kubenswrapper[4860]: [-]has-synced failed: reason withheld
Mar 20 10:58:44 crc kubenswrapper[4860]: [+]process-running ok
Mar 20 10:58:44 crc kubenswrapper[4860]: healthz check failed
Mar 20 10:58:44 crc kubenswrapper[4860]: I0320 10:58:44.024644 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5gdgj" podUID="d582dc3e-7510-42be-aa3a-1d15b35c327c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 10:58:44 crc kubenswrapper[4860]: I0320 10:58:44.167108 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-9ffd4b47b-9qh65"]
Mar 20 10:58:44 crc kubenswrapper[4860]: I0320 10:58:44.167394 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-9ffd4b47b-9qh65" podUID="2af6ac3e-b4b2-400b-bfd2-0142f07dabd0" containerName="controller-manager" containerID="cri-o://9e7a764e0c672086dc2232d2c01074598903d934f8d665fc7b2f1e55bf12ba31" gracePeriod=30
Mar 20 10:58:44 crc kubenswrapper[4860]: I0320 10:58:44.196751 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5dc8897f6c-8dhfx"]
Mar 20 10:58:44 crc kubenswrapper[4860]: I0320 10:58:44.197010 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5dc8897f6c-8dhfx" podUID="215a61d8-f0e1-419d-b4cb-8ddc801d5a79" containerName="route-controller-manager" containerID="cri-o://07cb2faf1a3563eaa92fb2a9f50a19ad2493c7dab6c573aa3b095afac43feb29" gracePeriod=30
Mar 20 10:58:45 crc kubenswrapper[4860]: I0320 10:58:45.024731 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-5gdgj"
Mar 20 10:58:45 crc kubenswrapper[4860]: I0320 10:58:45.028676 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-5gdgj"
Mar 20 10:58:45 crc kubenswrapper[4860]: I0320 10:58:45.305588 4860 generic.go:334] "Generic (PLEG): container finished" podID="2af6ac3e-b4b2-400b-bfd2-0142f07dabd0" containerID="9e7a764e0c672086dc2232d2c01074598903d934f8d665fc7b2f1e55bf12ba31" exitCode=0
Mar 20 10:58:45 crc kubenswrapper[4860]: I0320 10:58:45.305701 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-9ffd4b47b-9qh65" event={"ID":"2af6ac3e-b4b2-400b-bfd2-0142f07dabd0","Type":"ContainerDied","Data":"9e7a764e0c672086dc2232d2c01074598903d934f8d665fc7b2f1e55bf12ba31"}
Mar 20 10:58:45 crc kubenswrapper[4860]: I0320 10:58:45.309373 4860 generic.go:334] "Generic (PLEG): container finished" podID="215a61d8-f0e1-419d-b4cb-8ddc801d5a79" containerID="07cb2faf1a3563eaa92fb2a9f50a19ad2493c7dab6c573aa3b095afac43feb29" exitCode=0
Mar 20 10:58:45 crc kubenswrapper[4860]: I0320 10:58:45.309433 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5dc8897f6c-8dhfx" event={"ID":"215a61d8-f0e1-419d-b4cb-8ddc801d5a79","Type":"ContainerDied","Data":"07cb2faf1a3563eaa92fb2a9f50a19ad2493c7dab6c573aa3b095afac43feb29"}
Mar 20 10:58:46 crc kubenswrapper[4860]: I0320 10:58:46.977536 4860 patch_prober.go:28] interesting pod/controller-manager-9ffd4b47b-9qh65 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.45:8443/healthz\": dial tcp 10.217.0.45:8443: connect: connection refused" start-of-body=
Mar 20 10:58:46 crc kubenswrapper[4860]: I0320 10:58:46.977638 4860 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-9ffd4b47b-9qh65" podUID="2af6ac3e-b4b2-400b-bfd2-0142f07dabd0" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.45:8443/healthz\": dial tcp 10.217.0.45:8443: connect: connection refused"
Mar 20 10:58:48 crc kubenswrapper[4860]: I0320 10:58:48.681698 4860 ???:1] "http: TLS handshake error from 192.168.126.11:38990: no serving certificate available for the kubelet"
Mar 20 10:58:49 crc kubenswrapper[4860]: I0320 10:58:49.397699 4860 patch_prober.go:28] interesting pod/route-controller-manager-5dc8897f6c-8dhfx container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.46:8443/healthz\": dial tcp 10.217.0.46:8443: connect: connection refused" start-of-body=
Mar 20 10:58:49 crc kubenswrapper[4860]: I0320 10:58:49.397787 4860 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-5dc8897f6c-8dhfx" podUID="215a61d8-f0e1-419d-b4cb-8ddc801d5a79" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.46:8443/healthz\": dial tcp 10.217.0.46:8443: connect: connection refused"
Mar 20 10:58:52 crc kubenswrapper[4860]: I0320 10:58:52.344928 4860 patch_prober.go:28] interesting pod/machine-config-daemon-kvdqp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 10:58:52 crc kubenswrapper[4860]: I0320 10:58:52.345026 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 10:58:52 crc kubenswrapper[4860]: I0320 10:58:52.388885 4860 patch_prober.go:28] interesting pod/downloads-7954f5f757-45vfv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body=
Mar 20 10:58:52 crc kubenswrapper[4860]: I0320 10:58:52.389346 4860 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-45vfv" podUID="825c6b77-c03a-463c-b9a4-d26a1ac398f0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused"
Mar 20 10:58:52 crc kubenswrapper[4860]: I0320 10:58:52.388885 4860 patch_prober.go:28] interesting pod/downloads-7954f5f757-45vfv container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body=
Mar 20 10:58:52 crc kubenswrapper[4860]: I0320 10:58:52.389419 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-45vfv" podUID="825c6b77-c03a-463c-b9a4-d26a1ac398f0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused"
Mar 20 10:58:52 crc kubenswrapper[4860]: I0320 10:58:52.389457 4860 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console/downloads-7954f5f757-45vfv"
Mar 20 10:58:52 crc kubenswrapper[4860]: I0320 10:58:52.389937 4860 patch_prober.go:28] interesting pod/downloads-7954f5f757-45vfv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body=
Mar 20 10:58:52 crc kubenswrapper[4860]: I0320 10:58:52.390008 4860 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-45vfv" podUID="825c6b77-c03a-463c-b9a4-d26a1ac398f0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused"
Mar 20 10:58:52 crc kubenswrapper[4860]: I0320 10:58:52.390339 4860 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="download-server" containerStatusID={"Type":"cri-o","ID":"881f32cfad3d1ae79569c4b2cd108eca09988b3ebd33507a5a8247454327f28d"} pod="openshift-console/downloads-7954f5f757-45vfv" containerMessage="Container download-server failed liveness probe, will be restarted"
Mar 20 10:58:52 crc kubenswrapper[4860]: I0320 10:58:52.390404 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/downloads-7954f5f757-45vfv" podUID="825c6b77-c03a-463c-b9a4-d26a1ac398f0" containerName="download-server" containerID="cri-o://881f32cfad3d1ae79569c4b2cd108eca09988b3ebd33507a5a8247454327f28d" gracePeriod=2
Mar 20 10:58:52 crc kubenswrapper[4860]: I0320 10:58:52.627495 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-sqrz5"
Mar 20 10:58:52 crc kubenswrapper[4860]: I0320 10:58:52.633937 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-sqrz5"
Mar 20 10:58:53 crc kubenswrapper[4860]: I0320 10:58:53.210653 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm"
Mar 20 10:58:53 crc kubenswrapper[4860]: I0320 10:58:53.374186 4860 generic.go:334] "Generic (PLEG): container finished" podID="825c6b77-c03a-463c-b9a4-d26a1ac398f0" containerID="881f32cfad3d1ae79569c4b2cd108eca09988b3ebd33507a5a8247454327f28d" exitCode=0
Mar 20 10:58:53 crc kubenswrapper[4860]: I0320 10:58:53.374291 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-45vfv" event={"ID":"825c6b77-c03a-463c-b9a4-d26a1ac398f0","Type":"ContainerDied","Data":"881f32cfad3d1ae79569c4b2cd108eca09988b3ebd33507a5a8247454327f28d"}
Mar 20 10:58:57 crc kubenswrapper[4860]: I0320 10:58:57.978061 4860 patch_prober.go:28] interesting pod/controller-manager-9ffd4b47b-9qh65 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.45:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 20 10:58:57 crc kubenswrapper[4860]: I0320 10:58:57.978662 4860 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-9ffd4b47b-9qh65" podUID="2af6ac3e-b4b2-400b-bfd2-0142f07dabd0" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.45:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 20 10:58:59 crc kubenswrapper[4860]: I0320 10:58:59.397921 4860 patch_prober.go:28] interesting pod/route-controller-manager-5dc8897f6c-8dhfx container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.46:8443/healthz\": dial tcp 10.217.0.46:8443: connect: connection refused" start-of-body=
Mar 20 10:58:59 crc kubenswrapper[4860]: I0320 10:58:59.398064 4860 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-5dc8897f6c-8dhfx" podUID="215a61d8-f0e1-419d-b4cb-8ddc801d5a79" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.46:8443/healthz\": dial tcp 10.217.0.46:8443: connect: connection refused"
Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.397177 4860 patch_prober.go:28] interesting pod/downloads-7954f5f757-45vfv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body=
Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.399000 4860 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-45vfv" podUID="825c6b77-c03a-463c-b9a4-d26a1ac398f0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused"
Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.632706 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-9ffd4b47b-9qh65"
Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.677824 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-589db55d97-b7n5d"]
Mar 20 10:59:02 crc kubenswrapper[4860]: E0320 10:59:02.678175 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7f16bf2-db43-4057-9961-ef03202f7828" containerName="pruner"
Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.678191 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7f16bf2-db43-4057-9961-ef03202f7828" containerName="pruner"
Mar 20 10:59:02 crc kubenswrapper[4860]: E0320 10:59:02.678217 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="437c32d4-4b5f-4657-86d6-5214e3bfc01f" containerName="collect-profiles"
Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.678259 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="437c32d4-4b5f-4657-86d6-5214e3bfc01f" containerName="collect-profiles"
Mar 20 10:59:02 crc kubenswrapper[4860]: E0320 10:59:02.678282 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee04bfe1-80c1-43ea-9c2f-a8dde5f81388" containerName="pruner"
Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.678291 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee04bfe1-80c1-43ea-9c2f-a8dde5f81388" containerName="pruner"
Mar 20 10:59:02 crc kubenswrapper[4860]: E0320 10:59:02.678299 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2af6ac3e-b4b2-400b-bfd2-0142f07dabd0" containerName="controller-manager"
Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.678307 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="2af6ac3e-b4b2-400b-bfd2-0142f07dabd0" containerName="controller-manager"
Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.678641 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="2af6ac3e-b4b2-400b-bfd2-0142f07dabd0" containerName="controller-manager"
Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.678659 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7f16bf2-db43-4057-9961-ef03202f7828" containerName="pruner"
Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.678668 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="437c32d4-4b5f-4657-86d6-5214e3bfc01f" containerName="collect-profiles"
Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.678678 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee04bfe1-80c1-43ea-9c2f-a8dde5f81388" containerName="pruner"
Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.679181 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-589db55d97-b7n5d"
Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.692032 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-589db55d97-b7n5d"]
Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.763553 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mcqdw"
Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.771937 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2af6ac3e-b4b2-400b-bfd2-0142f07dabd0-config\") pod \"2af6ac3e-b4b2-400b-bfd2-0142f07dabd0\" (UID: \"2af6ac3e-b4b2-400b-bfd2-0142f07dabd0\") "
Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.772052 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdqss\" (UniqueName: \"kubernetes.io/projected/2af6ac3e-b4b2-400b-bfd2-0142f07dabd0-kube-api-access-bdqss\") pod \"2af6ac3e-b4b2-400b-bfd2-0142f07dabd0\" (UID: \"2af6ac3e-b4b2-400b-bfd2-0142f07dabd0\") "
Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.772075 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2af6ac3e-b4b2-400b-bfd2-0142f07dabd0-serving-cert\") pod \"2af6ac3e-b4b2-400b-bfd2-0142f07dabd0\" (UID: \"2af6ac3e-b4b2-400b-bfd2-0142f07dabd0\") "
Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.772104 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2af6ac3e-b4b2-400b-bfd2-0142f07dabd0-client-ca\") pod \"2af6ac3e-b4b2-400b-bfd2-0142f07dabd0\" (UID: \"2af6ac3e-b4b2-400b-bfd2-0142f07dabd0\") "
Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.772176 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2af6ac3e-b4b2-400b-bfd2-0142f07dabd0-proxy-ca-bundles\") pod \"2af6ac3e-b4b2-400b-bfd2-0142f07dabd0\" (UID: \"2af6ac3e-b4b2-400b-bfd2-0142f07dabd0\") "
Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.773017 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2af6ac3e-b4b2-400b-bfd2-0142f07dabd0-client-ca" (OuterVolumeSpecName: "client-ca") pod "2af6ac3e-b4b2-400b-bfd2-0142f07dabd0" (UID: "2af6ac3e-b4b2-400b-bfd2-0142f07dabd0"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.773467 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2af6ac3e-b4b2-400b-bfd2-0142f07dabd0-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "2af6ac3e-b4b2-400b-bfd2-0142f07dabd0" (UID: "2af6ac3e-b4b2-400b-bfd2-0142f07dabd0"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.773975 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmctq\" (UniqueName: \"kubernetes.io/projected/0dfd8c66-3b51-4cf2-acbb-eb764785f6d3-kube-api-access-fmctq\") pod \"controller-manager-589db55d97-b7n5d\" (UID: \"0dfd8c66-3b51-4cf2-acbb-eb764785f6d3\") " pod="openshift-controller-manager/controller-manager-589db55d97-b7n5d"
Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.774091 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0dfd8c66-3b51-4cf2-acbb-eb764785f6d3-config\") pod \"controller-manager-589db55d97-b7n5d\" (UID: \"0dfd8c66-3b51-4cf2-acbb-eb764785f6d3\") " pod="openshift-controller-manager/controller-manager-589db55d97-b7n5d"
Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.774132 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2af6ac3e-b4b2-400b-bfd2-0142f07dabd0-config" (OuterVolumeSpecName: "config") pod "2af6ac3e-b4b2-400b-bfd2-0142f07dabd0" (UID: "2af6ac3e-b4b2-400b-bfd2-0142f07dabd0"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.774182 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0dfd8c66-3b51-4cf2-acbb-eb764785f6d3-serving-cert\") pod \"controller-manager-589db55d97-b7n5d\" (UID: \"0dfd8c66-3b51-4cf2-acbb-eb764785f6d3\") " pod="openshift-controller-manager/controller-manager-589db55d97-b7n5d" Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.774295 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0dfd8c66-3b51-4cf2-acbb-eb764785f6d3-proxy-ca-bundles\") pod \"controller-manager-589db55d97-b7n5d\" (UID: \"0dfd8c66-3b51-4cf2-acbb-eb764785f6d3\") " pod="openshift-controller-manager/controller-manager-589db55d97-b7n5d" Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.774333 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0dfd8c66-3b51-4cf2-acbb-eb764785f6d3-client-ca\") pod \"controller-manager-589db55d97-b7n5d\" (UID: \"0dfd8c66-3b51-4cf2-acbb-eb764785f6d3\") " pod="openshift-controller-manager/controller-manager-589db55d97-b7n5d" Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.774429 4860 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2af6ac3e-b4b2-400b-bfd2-0142f07dabd0-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.774443 4860 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2af6ac3e-b4b2-400b-bfd2-0142f07dabd0-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.774454 4860 reconciler_common.go:293] "Volume detached for volume 
\"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2af6ac3e-b4b2-400b-bfd2-0142f07dabd0-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.790995 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2af6ac3e-b4b2-400b-bfd2-0142f07dabd0-kube-api-access-bdqss" (OuterVolumeSpecName: "kube-api-access-bdqss") pod "2af6ac3e-b4b2-400b-bfd2-0142f07dabd0" (UID: "2af6ac3e-b4b2-400b-bfd2-0142f07dabd0"). InnerVolumeSpecName "kube-api-access-bdqss". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.791602 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2af6ac3e-b4b2-400b-bfd2-0142f07dabd0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2af6ac3e-b4b2-400b-bfd2-0142f07dabd0" (UID: "2af6ac3e-b4b2-400b-bfd2-0142f07dabd0"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.876113 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0dfd8c66-3b51-4cf2-acbb-eb764785f6d3-client-ca\") pod \"controller-manager-589db55d97-b7n5d\" (UID: \"0dfd8c66-3b51-4cf2-acbb-eb764785f6d3\") " pod="openshift-controller-manager/controller-manager-589db55d97-b7n5d" Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.876208 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmctq\" (UniqueName: \"kubernetes.io/projected/0dfd8c66-3b51-4cf2-acbb-eb764785f6d3-kube-api-access-fmctq\") pod \"controller-manager-589db55d97-b7n5d\" (UID: \"0dfd8c66-3b51-4cf2-acbb-eb764785f6d3\") " pod="openshift-controller-manager/controller-manager-589db55d97-b7n5d" Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.876274 4860 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0dfd8c66-3b51-4cf2-acbb-eb764785f6d3-config\") pod \"controller-manager-589db55d97-b7n5d\" (UID: \"0dfd8c66-3b51-4cf2-acbb-eb764785f6d3\") " pod="openshift-controller-manager/controller-manager-589db55d97-b7n5d" Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.876337 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0dfd8c66-3b51-4cf2-acbb-eb764785f6d3-serving-cert\") pod \"controller-manager-589db55d97-b7n5d\" (UID: \"0dfd8c66-3b51-4cf2-acbb-eb764785f6d3\") " pod="openshift-controller-manager/controller-manager-589db55d97-b7n5d" Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.876371 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0dfd8c66-3b51-4cf2-acbb-eb764785f6d3-proxy-ca-bundles\") pod \"controller-manager-589db55d97-b7n5d\" (UID: \"0dfd8c66-3b51-4cf2-acbb-eb764785f6d3\") " pod="openshift-controller-manager/controller-manager-589db55d97-b7n5d" Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.876427 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdqss\" (UniqueName: \"kubernetes.io/projected/2af6ac3e-b4b2-400b-bfd2-0142f07dabd0-kube-api-access-bdqss\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.876445 4860 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2af6ac3e-b4b2-400b-bfd2-0142f07dabd0-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.877351 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0dfd8c66-3b51-4cf2-acbb-eb764785f6d3-client-ca\") pod \"controller-manager-589db55d97-b7n5d\" (UID: 
\"0dfd8c66-3b51-4cf2-acbb-eb764785f6d3\") " pod="openshift-controller-manager/controller-manager-589db55d97-b7n5d" Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.877568 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0dfd8c66-3b51-4cf2-acbb-eb764785f6d3-proxy-ca-bundles\") pod \"controller-manager-589db55d97-b7n5d\" (UID: \"0dfd8c66-3b51-4cf2-acbb-eb764785f6d3\") " pod="openshift-controller-manager/controller-manager-589db55d97-b7n5d" Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.878043 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0dfd8c66-3b51-4cf2-acbb-eb764785f6d3-config\") pod \"controller-manager-589db55d97-b7n5d\" (UID: \"0dfd8c66-3b51-4cf2-acbb-eb764785f6d3\") " pod="openshift-controller-manager/controller-manager-589db55d97-b7n5d" Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.886045 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0dfd8c66-3b51-4cf2-acbb-eb764785f6d3-serving-cert\") pod \"controller-manager-589db55d97-b7n5d\" (UID: \"0dfd8c66-3b51-4cf2-acbb-eb764785f6d3\") " pod="openshift-controller-manager/controller-manager-589db55d97-b7n5d" Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.895203 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmctq\" (UniqueName: \"kubernetes.io/projected/0dfd8c66-3b51-4cf2-acbb-eb764785f6d3-kube-api-access-fmctq\") pod \"controller-manager-589db55d97-b7n5d\" (UID: \"0dfd8c66-3b51-4cf2-acbb-eb764785f6d3\") " pod="openshift-controller-manager/controller-manager-589db55d97-b7n5d" Mar 20 10:59:03 crc kubenswrapper[4860]: I0320 10:59:03.007863 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-589db55d97-b7n5d" Mar 20 10:59:03 crc kubenswrapper[4860]: I0320 10:59:03.255064 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5dc8897f6c-8dhfx" Mar 20 10:59:03 crc kubenswrapper[4860]: I0320 10:59:03.385332 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/215a61d8-f0e1-419d-b4cb-8ddc801d5a79-client-ca\") pod \"215a61d8-f0e1-419d-b4cb-8ddc801d5a79\" (UID: \"215a61d8-f0e1-419d-b4cb-8ddc801d5a79\") " Mar 20 10:59:03 crc kubenswrapper[4860]: I0320 10:59:03.385398 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/215a61d8-f0e1-419d-b4cb-8ddc801d5a79-serving-cert\") pod \"215a61d8-f0e1-419d-b4cb-8ddc801d5a79\" (UID: \"215a61d8-f0e1-419d-b4cb-8ddc801d5a79\") " Mar 20 10:59:03 crc kubenswrapper[4860]: I0320 10:59:03.385439 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/215a61d8-f0e1-419d-b4cb-8ddc801d5a79-config\") pod \"215a61d8-f0e1-419d-b4cb-8ddc801d5a79\" (UID: \"215a61d8-f0e1-419d-b4cb-8ddc801d5a79\") " Mar 20 10:59:03 crc kubenswrapper[4860]: I0320 10:59:03.385555 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7npd\" (UniqueName: \"kubernetes.io/projected/215a61d8-f0e1-419d-b4cb-8ddc801d5a79-kube-api-access-n7npd\") pod \"215a61d8-f0e1-419d-b4cb-8ddc801d5a79\" (UID: \"215a61d8-f0e1-419d-b4cb-8ddc801d5a79\") " Mar 20 10:59:03 crc kubenswrapper[4860]: I0320 10:59:03.386691 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/215a61d8-f0e1-419d-b4cb-8ddc801d5a79-config" (OuterVolumeSpecName: "config") pod 
"215a61d8-f0e1-419d-b4cb-8ddc801d5a79" (UID: "215a61d8-f0e1-419d-b4cb-8ddc801d5a79"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:59:03 crc kubenswrapper[4860]: I0320 10:59:03.386778 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/215a61d8-f0e1-419d-b4cb-8ddc801d5a79-client-ca" (OuterVolumeSpecName: "client-ca") pod "215a61d8-f0e1-419d-b4cb-8ddc801d5a79" (UID: "215a61d8-f0e1-419d-b4cb-8ddc801d5a79"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:59:03 crc kubenswrapper[4860]: I0320 10:59:03.390503 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/215a61d8-f0e1-419d-b4cb-8ddc801d5a79-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "215a61d8-f0e1-419d-b4cb-8ddc801d5a79" (UID: "215a61d8-f0e1-419d-b4cb-8ddc801d5a79"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:59:03 crc kubenswrapper[4860]: I0320 10:59:03.394629 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/215a61d8-f0e1-419d-b4cb-8ddc801d5a79-kube-api-access-n7npd" (OuterVolumeSpecName: "kube-api-access-n7npd") pod "215a61d8-f0e1-419d-b4cb-8ddc801d5a79" (UID: "215a61d8-f0e1-419d-b4cb-8ddc801d5a79"). InnerVolumeSpecName "kube-api-access-n7npd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:59:03 crc kubenswrapper[4860]: I0320 10:59:03.451866 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5dc8897f6c-8dhfx" event={"ID":"215a61d8-f0e1-419d-b4cb-8ddc801d5a79","Type":"ContainerDied","Data":"7aa219357098d2e5cc353f906dff76cf1a673bc6396fbbfabef96091000e9adc"} Mar 20 10:59:03 crc kubenswrapper[4860]: I0320 10:59:03.451945 4860 scope.go:117] "RemoveContainer" containerID="07cb2faf1a3563eaa92fb2a9f50a19ad2493c7dab6c573aa3b095afac43feb29" Mar 20 10:59:03 crc kubenswrapper[4860]: I0320 10:59:03.452060 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5dc8897f6c-8dhfx" Mar 20 10:59:03 crc kubenswrapper[4860]: I0320 10:59:03.456717 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-9ffd4b47b-9qh65" event={"ID":"2af6ac3e-b4b2-400b-bfd2-0142f07dabd0","Type":"ContainerDied","Data":"63a0cd6f2cc2d9eeeaa1bb8e6f130de3ebcf3e0a4cb9179f998ccf85f93ed02e"} Mar 20 10:59:03 crc kubenswrapper[4860]: I0320 10:59:03.456829 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-9ffd4b47b-9qh65" Mar 20 10:59:03 crc kubenswrapper[4860]: I0320 10:59:03.479157 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5dc8897f6c-8dhfx"] Mar 20 10:59:03 crc kubenswrapper[4860]: I0320 10:59:03.483053 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5dc8897f6c-8dhfx"] Mar 20 10:59:03 crc kubenswrapper[4860]: I0320 10:59:03.489029 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7npd\" (UniqueName: \"kubernetes.io/projected/215a61d8-f0e1-419d-b4cb-8ddc801d5a79-kube-api-access-n7npd\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:03 crc kubenswrapper[4860]: I0320 10:59:03.489059 4860 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/215a61d8-f0e1-419d-b4cb-8ddc801d5a79-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:03 crc kubenswrapper[4860]: I0320 10:59:03.489071 4860 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/215a61d8-f0e1-419d-b4cb-8ddc801d5a79-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:03 crc kubenswrapper[4860]: I0320 10:59:03.489080 4860 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/215a61d8-f0e1-419d-b4cb-8ddc801d5a79-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:03 crc kubenswrapper[4860]: I0320 10:59:03.496660 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-9ffd4b47b-9qh65"] Mar 20 10:59:03 crc kubenswrapper[4860]: I0320 10:59:03.499799 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-9ffd4b47b-9qh65"] Mar 20 10:59:04 crc kubenswrapper[4860]: I0320 10:59:04.134291 4860 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-589db55d97-b7n5d"] Mar 20 10:59:04 crc kubenswrapper[4860]: I0320 10:59:04.268625 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56668659b9-pdhht"] Mar 20 10:59:04 crc kubenswrapper[4860]: E0320 10:59:04.268954 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="215a61d8-f0e1-419d-b4cb-8ddc801d5a79" containerName="route-controller-manager" Mar 20 10:59:04 crc kubenswrapper[4860]: I0320 10:59:04.268971 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="215a61d8-f0e1-419d-b4cb-8ddc801d5a79" containerName="route-controller-manager" Mar 20 10:59:04 crc kubenswrapper[4860]: I0320 10:59:04.269107 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="215a61d8-f0e1-419d-b4cb-8ddc801d5a79" containerName="route-controller-manager" Mar 20 10:59:04 crc kubenswrapper[4860]: I0320 10:59:04.269707 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-56668659b9-pdhht" Mar 20 10:59:04 crc kubenswrapper[4860]: I0320 10:59:04.272799 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 10:59:04 crc kubenswrapper[4860]: I0320 10:59:04.273016 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 10:59:04 crc kubenswrapper[4860]: I0320 10:59:04.273217 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 10:59:04 crc kubenswrapper[4860]: I0320 10:59:04.273565 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 10:59:04 crc kubenswrapper[4860]: I0320 10:59:04.284618 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 10:59:04 crc kubenswrapper[4860]: I0320 10:59:04.285557 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 10:59:04 crc kubenswrapper[4860]: I0320 10:59:04.291072 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56668659b9-pdhht"] Mar 20 10:59:04 crc kubenswrapper[4860]: I0320 10:59:04.404200 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9-serving-cert\") pod \"route-controller-manager-56668659b9-pdhht\" (UID: \"9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9\") " pod="openshift-route-controller-manager/route-controller-manager-56668659b9-pdhht" Mar 20 10:59:04 crc kubenswrapper[4860]: I0320 10:59:04.404277 4860 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59vs6\" (UniqueName: \"kubernetes.io/projected/9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9-kube-api-access-59vs6\") pod \"route-controller-manager-56668659b9-pdhht\" (UID: \"9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9\") " pod="openshift-route-controller-manager/route-controller-manager-56668659b9-pdhht" Mar 20 10:59:04 crc kubenswrapper[4860]: I0320 10:59:04.404497 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9-client-ca\") pod \"route-controller-manager-56668659b9-pdhht\" (UID: \"9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9\") " pod="openshift-route-controller-manager/route-controller-manager-56668659b9-pdhht" Mar 20 10:59:04 crc kubenswrapper[4860]: I0320 10:59:04.405163 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9-config\") pod \"route-controller-manager-56668659b9-pdhht\" (UID: \"9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9\") " pod="openshift-route-controller-manager/route-controller-manager-56668659b9-pdhht" Mar 20 10:59:04 crc kubenswrapper[4860]: I0320 10:59:04.506760 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9-client-ca\") pod \"route-controller-manager-56668659b9-pdhht\" (UID: \"9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9\") " pod="openshift-route-controller-manager/route-controller-manager-56668659b9-pdhht" Mar 20 10:59:04 crc kubenswrapper[4860]: I0320 10:59:04.506835 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9-config\") pod \"route-controller-manager-56668659b9-pdhht\" (UID: 
\"9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9\") " pod="openshift-route-controller-manager/route-controller-manager-56668659b9-pdhht" Mar 20 10:59:04 crc kubenswrapper[4860]: I0320 10:59:04.506904 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9-serving-cert\") pod \"route-controller-manager-56668659b9-pdhht\" (UID: \"9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9\") " pod="openshift-route-controller-manager/route-controller-manager-56668659b9-pdhht" Mar 20 10:59:04 crc kubenswrapper[4860]: I0320 10:59:04.506932 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59vs6\" (UniqueName: \"kubernetes.io/projected/9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9-kube-api-access-59vs6\") pod \"route-controller-manager-56668659b9-pdhht\" (UID: \"9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9\") " pod="openshift-route-controller-manager/route-controller-manager-56668659b9-pdhht" Mar 20 10:59:04 crc kubenswrapper[4860]: I0320 10:59:04.508710 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9-client-ca\") pod \"route-controller-manager-56668659b9-pdhht\" (UID: \"9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9\") " pod="openshift-route-controller-manager/route-controller-manager-56668659b9-pdhht" Mar 20 10:59:04 crc kubenswrapper[4860]: I0320 10:59:04.509208 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9-config\") pod \"route-controller-manager-56668659b9-pdhht\" (UID: \"9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9\") " pod="openshift-route-controller-manager/route-controller-manager-56668659b9-pdhht" Mar 20 10:59:04 crc kubenswrapper[4860]: I0320 10:59:04.521481 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9-serving-cert\") pod \"route-controller-manager-56668659b9-pdhht\" (UID: \"9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9\") " pod="openshift-route-controller-manager/route-controller-manager-56668659b9-pdhht" Mar 20 10:59:04 crc kubenswrapper[4860]: I0320 10:59:04.529044 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59vs6\" (UniqueName: \"kubernetes.io/projected/9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9-kube-api-access-59vs6\") pod \"route-controller-manager-56668659b9-pdhht\" (UID: \"9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9\") " pod="openshift-route-controller-manager/route-controller-manager-56668659b9-pdhht" Mar 20 10:59:04 crc kubenswrapper[4860]: E0320 10:59:04.551708 4860 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 20 10:59:04 crc kubenswrapper[4860]: E0320 10:59:04.551909 4860 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 10:59:04 crc kubenswrapper[4860]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 20 10:59:04 crc kubenswrapper[4860]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5hjpq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29566738-5cj22_openshift-infra(ba2ab33e-6ecc-4eac-9aaa-256e6ff68236): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 20 10:59:04 crc kubenswrapper[4860]: > logger="UnhandledError" Mar 20 10:59:04 crc kubenswrapper[4860]: E0320 10:59:04.553074 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29566738-5cj22" podUID="ba2ab33e-6ecc-4eac-9aaa-256e6ff68236" Mar 20 10:59:04 crc kubenswrapper[4860]: I0320 10:59:04.586096 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-56668659b9-pdhht" Mar 20 10:59:05 crc kubenswrapper[4860]: I0320 10:59:05.419923 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="215a61d8-f0e1-419d-b4cb-8ddc801d5a79" path="/var/lib/kubelet/pods/215a61d8-f0e1-419d-b4cb-8ddc801d5a79/volumes" Mar 20 10:59:05 crc kubenswrapper[4860]: I0320 10:59:05.420537 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2af6ac3e-b4b2-400b-bfd2-0142f07dabd0" path="/var/lib/kubelet/pods/2af6ac3e-b4b2-400b-bfd2-0142f07dabd0/volumes" Mar 20 10:59:05 crc kubenswrapper[4860]: E0320 10:59:05.478924 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29566738-5cj22" podUID="ba2ab33e-6ecc-4eac-9aaa-256e6ff68236" Mar 20 10:59:08 crc kubenswrapper[4860]: I0320 10:59:08.823378 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 20 10:59:08 crc kubenswrapper[4860]: I0320 10:59:08.826354 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 10:59:08 crc kubenswrapper[4860]: I0320 10:59:08.827373 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 20 10:59:08 crc kubenswrapper[4860]: I0320 10:59:08.831409 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 20 10:59:08 crc kubenswrapper[4860]: I0320 10:59:08.831485 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 20 10:59:08 crc kubenswrapper[4860]: I0320 10:59:08.948482 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/da7fc050-0408-49bf-a97d-3b5935573dc7-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"da7fc050-0408-49bf-a97d-3b5935573dc7\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 10:59:08 crc kubenswrapper[4860]: I0320 10:59:08.948627 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/da7fc050-0408-49bf-a97d-3b5935573dc7-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"da7fc050-0408-49bf-a97d-3b5935573dc7\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 10:59:09 crc kubenswrapper[4860]: I0320 10:59:09.049769 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/da7fc050-0408-49bf-a97d-3b5935573dc7-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"da7fc050-0408-49bf-a97d-3b5935573dc7\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 10:59:09 crc kubenswrapper[4860]: I0320 10:59:09.049847 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/da7fc050-0408-49bf-a97d-3b5935573dc7-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"da7fc050-0408-49bf-a97d-3b5935573dc7\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 10:59:09 crc kubenswrapper[4860]: I0320 10:59:09.049986 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/da7fc050-0408-49bf-a97d-3b5935573dc7-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"da7fc050-0408-49bf-a97d-3b5935573dc7\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 10:59:09 crc kubenswrapper[4860]: I0320 10:59:09.070001 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/da7fc050-0408-49bf-a97d-3b5935573dc7-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"da7fc050-0408-49bf-a97d-3b5935573dc7\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 10:59:09 crc kubenswrapper[4860]: I0320 10:59:09.150423 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 10:59:09 crc kubenswrapper[4860]: E0320 10:59:09.407474 4860 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 20 10:59:09 crc kubenswrapper[4860]: E0320 10:59:09.408031 4860 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kg46p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]Co
ntainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-d9xlp_openshift-marketplace(f20cb95e-5480-4c9c-859f-0b03d679ab06): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 10:59:09 crc kubenswrapper[4860]: E0320 10:59:09.409373 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-d9xlp" podUID="f20cb95e-5480-4c9c-859f-0b03d679ab06" Mar 20 10:59:11 crc kubenswrapper[4860]: E0320 10:59:11.334955 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-d9xlp" podUID="f20cb95e-5480-4c9c-859f-0b03d679ab06" Mar 20 10:59:11 crc kubenswrapper[4860]: E0320 10:59:11.432937 4860 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 20 10:59:11 crc kubenswrapper[4860]: E0320 10:59:11.433506 4860 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4h2z2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-w5w95_openshift-marketplace(4f84f111-5991-4e78-9508-82283b8e36f7): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 10:59:11 crc kubenswrapper[4860]: E0320 10:59:11.434720 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-w5w95" podUID="4f84f111-5991-4e78-9508-82283b8e36f7" Mar 20 10:59:11 crc 
kubenswrapper[4860]: E0320 10:59:11.479723 4860 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 20 10:59:11 crc kubenswrapper[4860]: E0320 10:59:11.479944 4860 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-77rcr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
certified-operators-p6sh9_openshift-marketplace(7b622f82-e01c-42b8-8061-16b6e8f551fb): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 10:59:11 crc kubenswrapper[4860]: E0320 10:59:11.481304 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-p6sh9" podUID="7b622f82-e01c-42b8-8061-16b6e8f551fb" Mar 20 10:59:11 crc kubenswrapper[4860]: E0320 10:59:11.498262 4860 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 20 10:59:11 crc kubenswrapper[4860]: E0320 10:59:11.498463 4860 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rln5j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-5jpww_openshift-marketplace(2268b7ae-c1db-4ef4-8236-60f7cfa277a1): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 10:59:11 crc kubenswrapper[4860]: E0320 10:59:11.502442 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-5jpww" podUID="2268b7ae-c1db-4ef4-8236-60f7cfa277a1" Mar 20 10:59:11 crc 
kubenswrapper[4860]: I0320 10:59:11.566466 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-q85gq"] Mar 20 10:59:12 crc kubenswrapper[4860]: I0320 10:59:12.389257 4860 patch_prober.go:28] interesting pod/downloads-7954f5f757-45vfv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Mar 20 10:59:12 crc kubenswrapper[4860]: I0320 10:59:12.389321 4860 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-45vfv" podUID="825c6b77-c03a-463c-b9a4-d26a1ac398f0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Mar 20 10:59:13 crc kubenswrapper[4860]: E0320 10:59:12.948329 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-5jpww" podUID="2268b7ae-c1db-4ef4-8236-60f7cfa277a1" Mar 20 10:59:13 crc kubenswrapper[4860]: E0320 10:59:12.951482 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-w5w95" podUID="4f84f111-5991-4e78-9508-82283b8e36f7" Mar 20 10:59:13 crc kubenswrapper[4860]: E0320 10:59:12.951569 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-p6sh9" 
podUID="7b622f82-e01c-42b8-8061-16b6e8f551fb" Mar 20 10:59:13 crc kubenswrapper[4860]: I0320 10:59:12.955294 4860 scope.go:117] "RemoveContainer" containerID="9e7a764e0c672086dc2232d2c01074598903d934f8d665fc7b2f1e55bf12ba31" Mar 20 10:59:13 crc kubenswrapper[4860]: E0320 10:59:13.087645 4860 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 20 10:59:13 crc kubenswrapper[4860]: E0320 10:59:13.087908 4860 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6fks5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,T
TY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-n79b7_openshift-marketplace(d2690d8b-c7f7-4e71-af44-33444e4d6187): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 10:59:13 crc kubenswrapper[4860]: E0320 10:59:13.089458 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-n79b7" podUID="d2690d8b-c7f7-4e71-af44-33444e4d6187" Mar 20 10:59:14 crc kubenswrapper[4860]: I0320 10:59:14.207660 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 20 10:59:14 crc kubenswrapper[4860]: I0320 10:59:14.208951 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 10:59:14 crc kubenswrapper[4860]: I0320 10:59:14.211143 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 20 10:59:14 crc kubenswrapper[4860]: I0320 10:59:14.328030 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f3e81cb0-739d-41be-b9fb-53a1c3de6a3d-var-lock\") pod \"installer-9-crc\" (UID: \"f3e81cb0-739d-41be-b9fb-53a1c3de6a3d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 10:59:14 crc kubenswrapper[4860]: I0320 10:59:14.328568 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f3e81cb0-739d-41be-b9fb-53a1c3de6a3d-kube-api-access\") pod \"installer-9-crc\" (UID: \"f3e81cb0-739d-41be-b9fb-53a1c3de6a3d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 10:59:14 crc kubenswrapper[4860]: I0320 10:59:14.328683 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f3e81cb0-739d-41be-b9fb-53a1c3de6a3d-kubelet-dir\") pod \"installer-9-crc\" (UID: \"f3e81cb0-739d-41be-b9fb-53a1c3de6a3d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 10:59:14 crc kubenswrapper[4860]: I0320 10:59:14.430363 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f3e81cb0-739d-41be-b9fb-53a1c3de6a3d-kube-api-access\") pod \"installer-9-crc\" (UID: \"f3e81cb0-739d-41be-b9fb-53a1c3de6a3d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 10:59:14 crc kubenswrapper[4860]: I0320 10:59:14.430428 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/f3e81cb0-739d-41be-b9fb-53a1c3de6a3d-kubelet-dir\") pod \"installer-9-crc\" (UID: \"f3e81cb0-739d-41be-b9fb-53a1c3de6a3d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 10:59:14 crc kubenswrapper[4860]: I0320 10:59:14.430454 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f3e81cb0-739d-41be-b9fb-53a1c3de6a3d-var-lock\") pod \"installer-9-crc\" (UID: \"f3e81cb0-739d-41be-b9fb-53a1c3de6a3d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 10:59:14 crc kubenswrapper[4860]: I0320 10:59:14.430532 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f3e81cb0-739d-41be-b9fb-53a1c3de6a3d-var-lock\") pod \"installer-9-crc\" (UID: \"f3e81cb0-739d-41be-b9fb-53a1c3de6a3d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 10:59:14 crc kubenswrapper[4860]: I0320 10:59:14.430597 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f3e81cb0-739d-41be-b9fb-53a1c3de6a3d-kubelet-dir\") pod \"installer-9-crc\" (UID: \"f3e81cb0-739d-41be-b9fb-53a1c3de6a3d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 10:59:14 crc kubenswrapper[4860]: I0320 10:59:14.456252 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f3e81cb0-739d-41be-b9fb-53a1c3de6a3d-kube-api-access\") pod \"installer-9-crc\" (UID: \"f3e81cb0-739d-41be-b9fb-53a1c3de6a3d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 10:59:14 crc kubenswrapper[4860]: I0320 10:59:14.541847 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 10:59:17 crc kubenswrapper[4860]: E0320 10:59:17.069361 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-n79b7" podUID="d2690d8b-c7f7-4e71-af44-33444e4d6187" Mar 20 10:59:17 crc kubenswrapper[4860]: W0320 10:59:17.070177 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod035f0b3d_92ee_4564_8dad_28b231e1c800.slice/crio-32118c993437fb49061fcbe8f50cc8afb3dd3653401bb561c3d02a24ff85d7ff WatchSource:0}: Error finding container 32118c993437fb49061fcbe8f50cc8afb3dd3653401bb561c3d02a24ff85d7ff: Status 404 returned error can't find the container with id 32118c993437fb49061fcbe8f50cc8afb3dd3653401bb561c3d02a24ff85d7ff Mar 20 10:59:17 crc kubenswrapper[4860]: E0320 10:59:17.245490 4860 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 20 10:59:17 crc kubenswrapper[4860]: E0320 10:59:17.246142 4860 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mn9d7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-jx27x_openshift-marketplace(f81a43aa-2c39-4d49-8526-f097322dd7bf): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 10:59:17 crc kubenswrapper[4860]: E0320 10:59:17.247958 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-jx27x" podUID="f81a43aa-2c39-4d49-8526-f097322dd7bf" Mar 20 10:59:17 crc 
kubenswrapper[4860]: E0320 10:59:17.267806 4860 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 20 10:59:17 crc kubenswrapper[4860]: E0320 10:59:17.268058 4860 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pdzkg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-operators-qq8bh_openshift-marketplace(514f05c3-1404-46c6-9f4d-68437ea8ee0b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 10:59:17 crc kubenswrapper[4860]: E0320 10:59:17.269195 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-qq8bh" podUID="514f05c3-1404-46c6-9f4d-68437ea8ee0b" Mar 20 10:59:17 crc kubenswrapper[4860]: E0320 10:59:17.283138 4860 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 20 10:59:17 crc kubenswrapper[4860]: E0320 10:59:17.283331 4860 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jhz7s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-r7ckk_openshift-marketplace(f0e14a08-824b-450f-bf98-2a476da0d44b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 10:59:17 crc kubenswrapper[4860]: E0320 10:59:17.285435 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-r7ckk" podUID="f0e14a08-824b-450f-bf98-2a476da0d44b" Mar 20 10:59:17 crc 
kubenswrapper[4860]: I0320 10:59:17.538194 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 20 10:59:17 crc kubenswrapper[4860]: I0320 10:59:17.562714 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-q85gq" event={"ID":"035f0b3d-92ee-4564-8dad-28b231e1c800","Type":"ContainerStarted","Data":"554d77b0c1bd97b8a6708815bacc2eb2bc0081bc636f5dec384b6a541029bbe9"} Mar 20 10:59:17 crc kubenswrapper[4860]: I0320 10:59:17.562763 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-q85gq" event={"ID":"035f0b3d-92ee-4564-8dad-28b231e1c800","Type":"ContainerStarted","Data":"32118c993437fb49061fcbe8f50cc8afb3dd3653401bb561c3d02a24ff85d7ff"} Mar 20 10:59:17 crc kubenswrapper[4860]: I0320 10:59:17.566543 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-45vfv" event={"ID":"825c6b77-c03a-463c-b9a4-d26a1ac398f0","Type":"ContainerStarted","Data":"ccdbfdc51f663e0a673d48a4d03c4efc47d1bc66fe97ee784d2b2cb54a8d3d07"} Mar 20 10:59:17 crc kubenswrapper[4860]: I0320 10:59:17.567397 4860 patch_prober.go:28] interesting pod/downloads-7954f5f757-45vfv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Mar 20 10:59:17 crc kubenswrapper[4860]: I0320 10:59:17.567403 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-45vfv" Mar 20 10:59:17 crc kubenswrapper[4860]: I0320 10:59:17.567443 4860 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-45vfv" podUID="825c6b77-c03a-463c-b9a4-d26a1ac398f0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" 
Mar 20 10:59:17 crc kubenswrapper[4860]: E0320 10:59:17.572892 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-jx27x" podUID="f81a43aa-2c39-4d49-8526-f097322dd7bf" Mar 20 10:59:17 crc kubenswrapper[4860]: E0320 10:59:17.573165 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-r7ckk" podUID="f0e14a08-824b-450f-bf98-2a476da0d44b" Mar 20 10:59:17 crc kubenswrapper[4860]: E0320 10:59:17.573261 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-qq8bh" podUID="514f05c3-1404-46c6-9f4d-68437ea8ee0b" Mar 20 10:59:17 crc kubenswrapper[4860]: I0320 10:59:17.624427 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 20 10:59:17 crc kubenswrapper[4860]: I0320 10:59:17.666149 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-589db55d97-b7n5d"] Mar 20 10:59:17 crc kubenswrapper[4860]: I0320 10:59:17.682583 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56668659b9-pdhht"] Mar 20 10:59:17 crc kubenswrapper[4860]: W0320 10:59:17.699858 4860 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0dfd8c66_3b51_4cf2_acbb_eb764785f6d3.slice/crio-c9e85d8e13725f9c1784bd56077ae640fa1d305a310b954cd20220316b4a0079 WatchSource:0}: Error finding container c9e85d8e13725f9c1784bd56077ae640fa1d305a310b954cd20220316b4a0079: Status 404 returned error can't find the container with id c9e85d8e13725f9c1784bd56077ae640fa1d305a310b954cd20220316b4a0079 Mar 20 10:59:17 crc kubenswrapper[4860]: W0320 10:59:17.700945 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b0ce2f8_6bac_4fd7_81ad_2478d13e62c9.slice/crio-fd4fb2650db90abfc2fc04379e97c1d923867f1354b3859056544eb5327ee868 WatchSource:0}: Error finding container fd4fb2650db90abfc2fc04379e97c1d923867f1354b3859056544eb5327ee868: Status 404 returned error can't find the container with id fd4fb2650db90abfc2fc04379e97c1d923867f1354b3859056544eb5327ee868 Mar 20 10:59:18 crc kubenswrapper[4860]: I0320 10:59:18.603151 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"da7fc050-0408-49bf-a97d-3b5935573dc7","Type":"ContainerStarted","Data":"266bdd2b9062e1f53f946b2bc3199ddc199622c764b2624c18e3421ceef03cb2"} Mar 20 10:59:18 crc kubenswrapper[4860]: I0320 10:59:18.605364 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"da7fc050-0408-49bf-a97d-3b5935573dc7","Type":"ContainerStarted","Data":"19d920be2e0ceb91c887b842ac3e676890de81e2b61d14681a49db1194b2588e"} Mar 20 10:59:18 crc kubenswrapper[4860]: I0320 10:59:18.605477 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-56668659b9-pdhht" event={"ID":"9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9","Type":"ContainerStarted","Data":"16a49325d8d5b96f821cc45d49f4e0565898b38dd7a7655ded8401b745e9fbeb"} Mar 20 10:59:18 crc kubenswrapper[4860]: I0320 
10:59:18.605616 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-56668659b9-pdhht" Mar 20 10:59:18 crc kubenswrapper[4860]: I0320 10:59:18.605716 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-56668659b9-pdhht" event={"ID":"9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9","Type":"ContainerStarted","Data":"fd4fb2650db90abfc2fc04379e97c1d923867f1354b3859056544eb5327ee868"} Mar 20 10:59:18 crc kubenswrapper[4860]: I0320 10:59:18.608069 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"f3e81cb0-739d-41be-b9fb-53a1c3de6a3d","Type":"ContainerStarted","Data":"8d39974009a23179ff960e42776dd6479d915e26394e49ae5752f3d385d29790"} Mar 20 10:59:18 crc kubenswrapper[4860]: I0320 10:59:18.608139 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"f3e81cb0-739d-41be-b9fb-53a1c3de6a3d","Type":"ContainerStarted","Data":"e4aa9d22701e1f5fe237a916fc50c217011c513544d30da810d469bc44fe2386"} Mar 20 10:59:18 crc kubenswrapper[4860]: I0320 10:59:18.609859 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-589db55d97-b7n5d" event={"ID":"0dfd8c66-3b51-4cf2-acbb-eb764785f6d3","Type":"ContainerStarted","Data":"7ad737b59cd7b7eee4027b7faa0b060cf06623e1799cc126eaa5ba23b4f1f09f"} Mar 20 10:59:18 crc kubenswrapper[4860]: I0320 10:59:18.609892 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-589db55d97-b7n5d" event={"ID":"0dfd8c66-3b51-4cf2-acbb-eb764785f6d3","Type":"ContainerStarted","Data":"c9e85d8e13725f9c1784bd56077ae640fa1d305a310b954cd20220316b4a0079"} Mar 20 10:59:18 crc kubenswrapper[4860]: I0320 10:59:18.609969 4860 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-controller-manager/controller-manager-589db55d97-b7n5d" podUID="0dfd8c66-3b51-4cf2-acbb-eb764785f6d3" containerName="controller-manager" containerID="cri-o://7ad737b59cd7b7eee4027b7faa0b060cf06623e1799cc126eaa5ba23b4f1f09f" gracePeriod=30 Mar 20 10:59:18 crc kubenswrapper[4860]: I0320 10:59:18.610126 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-589db55d97-b7n5d" Mar 20 10:59:18 crc kubenswrapper[4860]: I0320 10:59:18.615787 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-q85gq" event={"ID":"035f0b3d-92ee-4564-8dad-28b231e1c800","Type":"ContainerStarted","Data":"c4be3c5de2033f35854cf2c0ff9946500d6c0c890b9bd6674aa13c8d1bf3d782"} Mar 20 10:59:18 crc kubenswrapper[4860]: I0320 10:59:18.616710 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-589db55d97-b7n5d" Mar 20 10:59:18 crc kubenswrapper[4860]: I0320 10:59:18.618264 4860 patch_prober.go:28] interesting pod/downloads-7954f5f757-45vfv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Mar 20 10:59:18 crc kubenswrapper[4860]: I0320 10:59:18.618309 4860 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-45vfv" podUID="825c6b77-c03a-463c-b9a4-d26a1ac398f0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Mar 20 10:59:18 crc kubenswrapper[4860]: I0320 10:59:18.632592 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=10.632571202 podStartE2EDuration="10.632571202s" podCreationTimestamp="2026-03-20 10:59:08 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:59:18.631409519 +0000 UTC m=+282.852770417" watchObservedRunningTime="2026-03-20 10:59:18.632571202 +0000 UTC m=+282.853932100" Mar 20 10:59:18 crc kubenswrapper[4860]: I0320 10:59:18.653392 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-56668659b9-pdhht" podStartSLOduration=14.653366431 podStartE2EDuration="14.653366431s" podCreationTimestamp="2026-03-20 10:59:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:59:18.649544813 +0000 UTC m=+282.870905731" watchObservedRunningTime="2026-03-20 10:59:18.653366431 +0000 UTC m=+282.874727339" Mar 20 10:59:18 crc kubenswrapper[4860]: I0320 10:59:18.668284 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-q85gq" podStartSLOduration=210.668256113 podStartE2EDuration="3m30.668256113s" podCreationTimestamp="2026-03-20 10:55:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:59:18.667625636 +0000 UTC m=+282.888986534" watchObservedRunningTime="2026-03-20 10:59:18.668256113 +0000 UTC m=+282.889617011" Mar 20 10:59:18 crc kubenswrapper[4860]: I0320 10:59:18.685674 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-589db55d97-b7n5d" podStartSLOduration=34.685644986 podStartE2EDuration="34.685644986s" podCreationTimestamp="2026-03-20 10:58:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:59:18.682678112 +0000 UTC m=+282.904039010" 
watchObservedRunningTime="2026-03-20 10:59:18.685644986 +0000 UTC m=+282.907005884" Mar 20 10:59:18 crc kubenswrapper[4860]: I0320 10:59:18.757092 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-56668659b9-pdhht" Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.544948 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-589db55d97-b7n5d" Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.573504 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-678879cf8c-f74zz"] Mar 20 10:59:19 crc kubenswrapper[4860]: E0320 10:59:19.573821 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dfd8c66-3b51-4cf2-acbb-eb764785f6d3" containerName="controller-manager" Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.573840 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dfd8c66-3b51-4cf2-acbb-eb764785f6d3" containerName="controller-manager" Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.573991 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dfd8c66-3b51-4cf2-acbb-eb764785f6d3" containerName="controller-manager" Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.574538 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-678879cf8c-f74zz" Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.591561 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-678879cf8c-f74zz"] Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.624527 4860 generic.go:334] "Generic (PLEG): container finished" podID="da7fc050-0408-49bf-a97d-3b5935573dc7" containerID="266bdd2b9062e1f53f946b2bc3199ddc199622c764b2624c18e3421ceef03cb2" exitCode=0 Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.624608 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"da7fc050-0408-49bf-a97d-3b5935573dc7","Type":"ContainerDied","Data":"266bdd2b9062e1f53f946b2bc3199ddc199622c764b2624c18e3421ceef03cb2"} Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.625921 4860 generic.go:334] "Generic (PLEG): container finished" podID="0dfd8c66-3b51-4cf2-acbb-eb764785f6d3" containerID="7ad737b59cd7b7eee4027b7faa0b060cf06623e1799cc126eaa5ba23b4f1f09f" exitCode=0 Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.626564 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-589db55d97-b7n5d" Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.626671 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-589db55d97-b7n5d" event={"ID":"0dfd8c66-3b51-4cf2-acbb-eb764785f6d3","Type":"ContainerDied","Data":"7ad737b59cd7b7eee4027b7faa0b060cf06623e1799cc126eaa5ba23b4f1f09f"} Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.626694 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-589db55d97-b7n5d" event={"ID":"0dfd8c66-3b51-4cf2-acbb-eb764785f6d3","Type":"ContainerDied","Data":"c9e85d8e13725f9c1784bd56077ae640fa1d305a310b954cd20220316b4a0079"} Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.626712 4860 scope.go:117] "RemoveContainer" containerID="7ad737b59cd7b7eee4027b7faa0b060cf06623e1799cc126eaa5ba23b4f1f09f" Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.628004 4860 patch_prober.go:28] interesting pod/downloads-7954f5f757-45vfv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.628048 4860 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-45vfv" podUID="825c6b77-c03a-463c-b9a4-d26a1ac398f0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.650649 4860 scope.go:117] "RemoveContainer" containerID="7ad737b59cd7b7eee4027b7faa0b060cf06623e1799cc126eaa5ba23b4f1f09f" Mar 20 10:59:19 crc kubenswrapper[4860]: E0320 10:59:19.651283 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could 
not find container \"7ad737b59cd7b7eee4027b7faa0b060cf06623e1799cc126eaa5ba23b4f1f09f\": container with ID starting with 7ad737b59cd7b7eee4027b7faa0b060cf06623e1799cc126eaa5ba23b4f1f09f not found: ID does not exist" containerID="7ad737b59cd7b7eee4027b7faa0b060cf06623e1799cc126eaa5ba23b4f1f09f" Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.651329 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ad737b59cd7b7eee4027b7faa0b060cf06623e1799cc126eaa5ba23b4f1f09f"} err="failed to get container status \"7ad737b59cd7b7eee4027b7faa0b060cf06623e1799cc126eaa5ba23b4f1f09f\": rpc error: code = NotFound desc = could not find container \"7ad737b59cd7b7eee4027b7faa0b060cf06623e1799cc126eaa5ba23b4f1f09f\": container with ID starting with 7ad737b59cd7b7eee4027b7faa0b060cf06623e1799cc126eaa5ba23b4f1f09f not found: ID does not exist" Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.666185 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=5.66615936 podStartE2EDuration="5.66615936s" podCreationTimestamp="2026-03-20 10:59:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:59:19.664639097 +0000 UTC m=+283.885999995" watchObservedRunningTime="2026-03-20 10:59:19.66615936 +0000 UTC m=+283.887520258" Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.719463 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0dfd8c66-3b51-4cf2-acbb-eb764785f6d3-serving-cert\") pod \"0dfd8c66-3b51-4cf2-acbb-eb764785f6d3\" (UID: \"0dfd8c66-3b51-4cf2-acbb-eb764785f6d3\") " Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.719542 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/0dfd8c66-3b51-4cf2-acbb-eb764785f6d3-config\") pod \"0dfd8c66-3b51-4cf2-acbb-eb764785f6d3\" (UID: \"0dfd8c66-3b51-4cf2-acbb-eb764785f6d3\") " Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.719583 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmctq\" (UniqueName: \"kubernetes.io/projected/0dfd8c66-3b51-4cf2-acbb-eb764785f6d3-kube-api-access-fmctq\") pod \"0dfd8c66-3b51-4cf2-acbb-eb764785f6d3\" (UID: \"0dfd8c66-3b51-4cf2-acbb-eb764785f6d3\") " Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.719753 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0dfd8c66-3b51-4cf2-acbb-eb764785f6d3-proxy-ca-bundles\") pod \"0dfd8c66-3b51-4cf2-acbb-eb764785f6d3\" (UID: \"0dfd8c66-3b51-4cf2-acbb-eb764785f6d3\") " Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.719798 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0dfd8c66-3b51-4cf2-acbb-eb764785f6d3-client-ca\") pod \"0dfd8c66-3b51-4cf2-acbb-eb764785f6d3\" (UID: \"0dfd8c66-3b51-4cf2-acbb-eb764785f6d3\") " Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.720315 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5a806156-ca3b-43dd-8b19-c072188004b7-client-ca\") pod \"controller-manager-678879cf8c-f74zz\" (UID: \"5a806156-ca3b-43dd-8b19-c072188004b7\") " pod="openshift-controller-manager/controller-manager-678879cf8c-f74zz" Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.720361 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a806156-ca3b-43dd-8b19-c072188004b7-serving-cert\") pod \"controller-manager-678879cf8c-f74zz\" (UID: 
\"5a806156-ca3b-43dd-8b19-c072188004b7\") " pod="openshift-controller-manager/controller-manager-678879cf8c-f74zz" Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.720394 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a806156-ca3b-43dd-8b19-c072188004b7-config\") pod \"controller-manager-678879cf8c-f74zz\" (UID: \"5a806156-ca3b-43dd-8b19-c072188004b7\") " pod="openshift-controller-manager/controller-manager-678879cf8c-f74zz" Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.720443 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsk6l\" (UniqueName: \"kubernetes.io/projected/5a806156-ca3b-43dd-8b19-c072188004b7-kube-api-access-rsk6l\") pod \"controller-manager-678879cf8c-f74zz\" (UID: \"5a806156-ca3b-43dd-8b19-c072188004b7\") " pod="openshift-controller-manager/controller-manager-678879cf8c-f74zz" Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.720525 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5a806156-ca3b-43dd-8b19-c072188004b7-proxy-ca-bundles\") pod \"controller-manager-678879cf8c-f74zz\" (UID: \"5a806156-ca3b-43dd-8b19-c072188004b7\") " pod="openshift-controller-manager/controller-manager-678879cf8c-f74zz" Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.721156 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0dfd8c66-3b51-4cf2-acbb-eb764785f6d3-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "0dfd8c66-3b51-4cf2-acbb-eb764785f6d3" (UID: "0dfd8c66-3b51-4cf2-acbb-eb764785f6d3"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.721163 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0dfd8c66-3b51-4cf2-acbb-eb764785f6d3-client-ca" (OuterVolumeSpecName: "client-ca") pod "0dfd8c66-3b51-4cf2-acbb-eb764785f6d3" (UID: "0dfd8c66-3b51-4cf2-acbb-eb764785f6d3"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.721218 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0dfd8c66-3b51-4cf2-acbb-eb764785f6d3-config" (OuterVolumeSpecName: "config") pod "0dfd8c66-3b51-4cf2-acbb-eb764785f6d3" (UID: "0dfd8c66-3b51-4cf2-acbb-eb764785f6d3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.726962 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dfd8c66-3b51-4cf2-acbb-eb764785f6d3-kube-api-access-fmctq" (OuterVolumeSpecName: "kube-api-access-fmctq") pod "0dfd8c66-3b51-4cf2-acbb-eb764785f6d3" (UID: "0dfd8c66-3b51-4cf2-acbb-eb764785f6d3"). InnerVolumeSpecName "kube-api-access-fmctq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.727784 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dfd8c66-3b51-4cf2-acbb-eb764785f6d3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0dfd8c66-3b51-4cf2-acbb-eb764785f6d3" (UID: "0dfd8c66-3b51-4cf2-acbb-eb764785f6d3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.821683 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5a806156-ca3b-43dd-8b19-c072188004b7-proxy-ca-bundles\") pod \"controller-manager-678879cf8c-f74zz\" (UID: \"5a806156-ca3b-43dd-8b19-c072188004b7\") " pod="openshift-controller-manager/controller-manager-678879cf8c-f74zz" Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.822133 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5a806156-ca3b-43dd-8b19-c072188004b7-client-ca\") pod \"controller-manager-678879cf8c-f74zz\" (UID: \"5a806156-ca3b-43dd-8b19-c072188004b7\") " pod="openshift-controller-manager/controller-manager-678879cf8c-f74zz" Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.822241 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a806156-ca3b-43dd-8b19-c072188004b7-serving-cert\") pod \"controller-manager-678879cf8c-f74zz\" (UID: \"5a806156-ca3b-43dd-8b19-c072188004b7\") " pod="openshift-controller-manager/controller-manager-678879cf8c-f74zz" Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.822321 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a806156-ca3b-43dd-8b19-c072188004b7-config\") pod \"controller-manager-678879cf8c-f74zz\" (UID: \"5a806156-ca3b-43dd-8b19-c072188004b7\") " pod="openshift-controller-manager/controller-manager-678879cf8c-f74zz" Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.822452 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsk6l\" (UniqueName: \"kubernetes.io/projected/5a806156-ca3b-43dd-8b19-c072188004b7-kube-api-access-rsk6l\") pod 
\"controller-manager-678879cf8c-f74zz\" (UID: \"5a806156-ca3b-43dd-8b19-c072188004b7\") " pod="openshift-controller-manager/controller-manager-678879cf8c-f74zz" Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.822642 4860 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0dfd8c66-3b51-4cf2-acbb-eb764785f6d3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.822750 4860 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0dfd8c66-3b51-4cf2-acbb-eb764785f6d3-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.822829 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmctq\" (UniqueName: \"kubernetes.io/projected/0dfd8c66-3b51-4cf2-acbb-eb764785f6d3-kube-api-access-fmctq\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.822896 4860 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0dfd8c66-3b51-4cf2-acbb-eb764785f6d3-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.823193 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5a806156-ca3b-43dd-8b19-c072188004b7-proxy-ca-bundles\") pod \"controller-manager-678879cf8c-f74zz\" (UID: \"5a806156-ca3b-43dd-8b19-c072188004b7\") " pod="openshift-controller-manager/controller-manager-678879cf8c-f74zz" Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.823715 4860 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0dfd8c66-3b51-4cf2-acbb-eb764785f6d3-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.824850 4860 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5a806156-ca3b-43dd-8b19-c072188004b7-client-ca\") pod \"controller-manager-678879cf8c-f74zz\" (UID: \"5a806156-ca3b-43dd-8b19-c072188004b7\") " pod="openshift-controller-manager/controller-manager-678879cf8c-f74zz" Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.826278 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a806156-ca3b-43dd-8b19-c072188004b7-serving-cert\") pod \"controller-manager-678879cf8c-f74zz\" (UID: \"5a806156-ca3b-43dd-8b19-c072188004b7\") " pod="openshift-controller-manager/controller-manager-678879cf8c-f74zz" Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.826730 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a806156-ca3b-43dd-8b19-c072188004b7-config\") pod \"controller-manager-678879cf8c-f74zz\" (UID: \"5a806156-ca3b-43dd-8b19-c072188004b7\") " pod="openshift-controller-manager/controller-manager-678879cf8c-f74zz" Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.846123 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsk6l\" (UniqueName: \"kubernetes.io/projected/5a806156-ca3b-43dd-8b19-c072188004b7-kube-api-access-rsk6l\") pod \"controller-manager-678879cf8c-f74zz\" (UID: \"5a806156-ca3b-43dd-8b19-c072188004b7\") " pod="openshift-controller-manager/controller-manager-678879cf8c-f74zz" Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.898938 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-678879cf8c-f74zz" Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.980982 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-589db55d97-b7n5d"] Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.993161 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-589db55d97-b7n5d"] Mar 20 10:59:20 crc kubenswrapper[4860]: I0320 10:59:20.125048 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-678879cf8c-f74zz"] Mar 20 10:59:20 crc kubenswrapper[4860]: W0320 10:59:20.125964 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a806156_ca3b_43dd_8b19_c072188004b7.slice/crio-bc7889716935ff28fa767611a372177d38ff47e7dcc741593b2d67958a68c43b WatchSource:0}: Error finding container bc7889716935ff28fa767611a372177d38ff47e7dcc741593b2d67958a68c43b: Status 404 returned error can't find the container with id bc7889716935ff28fa767611a372177d38ff47e7dcc741593b2d67958a68c43b Mar 20 10:59:20 crc kubenswrapper[4860]: I0320 10:59:20.636529 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-678879cf8c-f74zz" event={"ID":"5a806156-ca3b-43dd-8b19-c072188004b7","Type":"ContainerStarted","Data":"8e915f332982c6439672c93348e89229311f91ce1bb0d43f012cc3a9b09a0bfa"} Mar 20 10:59:20 crc kubenswrapper[4860]: I0320 10:59:20.637010 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-678879cf8c-f74zz" event={"ID":"5a806156-ca3b-43dd-8b19-c072188004b7","Type":"ContainerStarted","Data":"bc7889716935ff28fa767611a372177d38ff47e7dcc741593b2d67958a68c43b"} Mar 20 10:59:20 crc kubenswrapper[4860]: I0320 10:59:20.664238 4860 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-controller-manager/controller-manager-678879cf8c-f74zz" podStartSLOduration=16.664193199 podStartE2EDuration="16.664193199s" podCreationTimestamp="2026-03-20 10:59:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:59:20.661239826 +0000 UTC m=+284.882600734" watchObservedRunningTime="2026-03-20 10:59:20.664193199 +0000 UTC m=+284.885554097" Mar 20 10:59:20 crc kubenswrapper[4860]: I0320 10:59:20.915990 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 10:59:21 crc kubenswrapper[4860]: I0320 10:59:21.056927 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/da7fc050-0408-49bf-a97d-3b5935573dc7-kube-api-access\") pod \"da7fc050-0408-49bf-a97d-3b5935573dc7\" (UID: \"da7fc050-0408-49bf-a97d-3b5935573dc7\") " Mar 20 10:59:21 crc kubenswrapper[4860]: I0320 10:59:21.058390 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/da7fc050-0408-49bf-a97d-3b5935573dc7-kubelet-dir\") pod \"da7fc050-0408-49bf-a97d-3b5935573dc7\" (UID: \"da7fc050-0408-49bf-a97d-3b5935573dc7\") " Mar 20 10:59:21 crc kubenswrapper[4860]: I0320 10:59:21.058863 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/da7fc050-0408-49bf-a97d-3b5935573dc7-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "da7fc050-0408-49bf-a97d-3b5935573dc7" (UID: "da7fc050-0408-49bf-a97d-3b5935573dc7"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 10:59:21 crc kubenswrapper[4860]: I0320 10:59:21.070849 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da7fc050-0408-49bf-a97d-3b5935573dc7-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "da7fc050-0408-49bf-a97d-3b5935573dc7" (UID: "da7fc050-0408-49bf-a97d-3b5935573dc7"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:59:21 crc kubenswrapper[4860]: I0320 10:59:21.159921 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/da7fc050-0408-49bf-a97d-3b5935573dc7-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:21 crc kubenswrapper[4860]: I0320 10:59:21.159971 4860 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/da7fc050-0408-49bf-a97d-3b5935573dc7-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:21 crc kubenswrapper[4860]: I0320 10:59:21.419993 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0dfd8c66-3b51-4cf2-acbb-eb764785f6d3" path="/var/lib/kubelet/pods/0dfd8c66-3b51-4cf2-acbb-eb764785f6d3/volumes" Mar 20 10:59:21 crc kubenswrapper[4860]: I0320 10:59:21.644469 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 10:59:21 crc kubenswrapper[4860]: I0320 10:59:21.644436 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"da7fc050-0408-49bf-a97d-3b5935573dc7","Type":"ContainerDied","Data":"19d920be2e0ceb91c887b842ac3e676890de81e2b61d14681a49db1194b2588e"} Mar 20 10:59:21 crc kubenswrapper[4860]: I0320 10:59:21.644530 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="19d920be2e0ceb91c887b842ac3e676890de81e2b61d14681a49db1194b2588e" Mar 20 10:59:21 crc kubenswrapper[4860]: I0320 10:59:21.646113 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-678879cf8c-f74zz" Mar 20 10:59:21 crc kubenswrapper[4860]: I0320 10:59:21.651263 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-678879cf8c-f74zz" Mar 20 10:59:22 crc kubenswrapper[4860]: I0320 10:59:22.344688 4860 patch_prober.go:28] interesting pod/machine-config-daemon-kvdqp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 10:59:22 crc kubenswrapper[4860]: I0320 10:59:22.345176 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 10:59:22 crc kubenswrapper[4860]: I0320 10:59:22.345260 4860 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" Mar 20 10:59:22 
crc kubenswrapper[4860]: I0320 10:59:22.345838 4860 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2176d4712fa4ee31d4322bb4cf9433058b2da0d6000e5bfb9e7d230fdffb3eda"} pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 10:59:22 crc kubenswrapper[4860]: I0320 10:59:22.345902 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" containerID="cri-o://2176d4712fa4ee31d4322bb4cf9433058b2da0d6000e5bfb9e7d230fdffb3eda" gracePeriod=600 Mar 20 10:59:22 crc kubenswrapper[4860]: I0320 10:59:22.388493 4860 patch_prober.go:28] interesting pod/downloads-7954f5f757-45vfv container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Mar 20 10:59:22 crc kubenswrapper[4860]: I0320 10:59:22.389191 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-45vfv" podUID="825c6b77-c03a-463c-b9a4-d26a1ac398f0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Mar 20 10:59:22 crc kubenswrapper[4860]: I0320 10:59:22.388512 4860 patch_prober.go:28] interesting pod/downloads-7954f5f757-45vfv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Mar 20 10:59:22 crc kubenswrapper[4860]: I0320 10:59:22.389423 4860 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-console/downloads-7954f5f757-45vfv" podUID="825c6b77-c03a-463c-b9a4-d26a1ac398f0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Mar 20 10:59:22 crc kubenswrapper[4860]: I0320 10:59:22.651254 4860 generic.go:334] "Generic (PLEG): container finished" podID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerID="2176d4712fa4ee31d4322bb4cf9433058b2da0d6000e5bfb9e7d230fdffb3eda" exitCode=0 Mar 20 10:59:22 crc kubenswrapper[4860]: I0320 10:59:22.652059 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" event={"ID":"6a9df230-75a1-4b64-8d00-c179e9c19080","Type":"ContainerDied","Data":"2176d4712fa4ee31d4322bb4cf9433058b2da0d6000e5bfb9e7d230fdffb3eda"} Mar 20 10:59:24 crc kubenswrapper[4860]: I0320 10:59:24.151474 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-678879cf8c-f74zz"] Mar 20 10:59:24 crc kubenswrapper[4860]: I0320 10:59:24.165463 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56668659b9-pdhht"] Mar 20 10:59:24 crc kubenswrapper[4860]: I0320 10:59:24.165705 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-56668659b9-pdhht" podUID="9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9" containerName="route-controller-manager" containerID="cri-o://16a49325d8d5b96f821cc45d49f4e0565898b38dd7a7655ded8401b745e9fbeb" gracePeriod=30 Mar 20 10:59:24 crc kubenswrapper[4860]: I0320 10:59:24.579174 4860 csr.go:261] certificate signing request csr-bl4br is approved, waiting to be issued Mar 20 10:59:24 crc kubenswrapper[4860]: I0320 10:59:24.586828 4860 patch_prober.go:28] interesting pod/route-controller-manager-56668659b9-pdhht container/route-controller-manager 
namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.58:8443/healthz\": dial tcp 10.217.0.58:8443: connect: connection refused" start-of-body= Mar 20 10:59:24 crc kubenswrapper[4860]: I0320 10:59:24.586906 4860 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-56668659b9-pdhht" podUID="9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.58:8443/healthz\": dial tcp 10.217.0.58:8443: connect: connection refused" Mar 20 10:59:24 crc kubenswrapper[4860]: I0320 10:59:24.590441 4860 csr.go:257] certificate signing request csr-bl4br is issued Mar 20 10:59:24 crc kubenswrapper[4860]: I0320 10:59:24.665890 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" event={"ID":"6a9df230-75a1-4b64-8d00-c179e9c19080","Type":"ContainerStarted","Data":"13fba41711384af20ab8200d2257307c76fbc844be8572bbe7995f53f6fb9ca6"} Mar 20 10:59:24 crc kubenswrapper[4860]: I0320 10:59:24.667743 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566738-5cj22" event={"ID":"ba2ab33e-6ecc-4eac-9aaa-256e6ff68236","Type":"ContainerStarted","Data":"133313b654a587091e098b1e8505700f3bdc77cfa2efebc3f2529891730788bc"} Mar 20 10:59:24 crc kubenswrapper[4860]: I0320 10:59:24.669293 4860 generic.go:334] "Generic (PLEG): container finished" podID="9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9" containerID="16a49325d8d5b96f821cc45d49f4e0565898b38dd7a7655ded8401b745e9fbeb" exitCode=0 Mar 20 10:59:24 crc kubenswrapper[4860]: I0320 10:59:24.669377 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-56668659b9-pdhht" 
event={"ID":"9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9","Type":"ContainerDied","Data":"16a49325d8d5b96f821cc45d49f4e0565898b38dd7a7655ded8401b745e9fbeb"} Mar 20 10:59:24 crc kubenswrapper[4860]: I0320 10:59:24.669479 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-678879cf8c-f74zz" podUID="5a806156-ca3b-43dd-8b19-c072188004b7" containerName="controller-manager" containerID="cri-o://8e915f332982c6439672c93348e89229311f91ce1bb0d43f012cc3a9b09a0bfa" gracePeriod=30 Mar 20 10:59:24 crc kubenswrapper[4860]: I0320 10:59:24.704589 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566738-5cj22" podStartSLOduration=26.574909835 podStartE2EDuration="1m24.704568508s" podCreationTimestamp="2026-03-20 10:58:00 +0000 UTC" firstStartedPulling="2026-03-20 10:58:25.246833699 +0000 UTC m=+229.468194597" lastFinishedPulling="2026-03-20 10:59:23.376492372 +0000 UTC m=+287.597853270" observedRunningTime="2026-03-20 10:59:24.701073708 +0000 UTC m=+288.922434606" watchObservedRunningTime="2026-03-20 10:59:24.704568508 +0000 UTC m=+288.925929406" Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.383257 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-56668659b9-pdhht" Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.430488 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6988ff758f-g2fv4"] Mar 20 10:59:25 crc kubenswrapper[4860]: E0320 10:59:25.430853 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da7fc050-0408-49bf-a97d-3b5935573dc7" containerName="pruner" Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.430877 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="da7fc050-0408-49bf-a97d-3b5935573dc7" containerName="pruner" Mar 20 10:59:25 crc kubenswrapper[4860]: E0320 10:59:25.430895 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9" containerName="route-controller-manager" Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.430906 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9" containerName="route-controller-manager" Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.431063 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="da7fc050-0408-49bf-a97d-3b5935573dc7" containerName="pruner" Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.431083 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9" containerName="route-controller-manager" Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.431738 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6988ff758f-g2fv4" Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.434803 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6988ff758f-g2fv4"] Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.438086 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59vs6\" (UniqueName: \"kubernetes.io/projected/9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9-kube-api-access-59vs6\") pod \"9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9\" (UID: \"9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9\") " Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.438285 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9-config\") pod \"9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9\" (UID: \"9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9\") " Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.438439 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9-serving-cert\") pod \"9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9\" (UID: \"9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9\") " Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.438562 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9-client-ca\") pod \"9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9\" (UID: \"9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9\") " Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.440883 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9-config" (OuterVolumeSpecName: "config") pod "9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9" (UID: 
"9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.441461 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9-client-ca" (OuterVolumeSpecName: "client-ca") pod "9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9" (UID: "9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.447051 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9" (UID: "9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.447195 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9-kube-api-access-59vs6" (OuterVolumeSpecName: "kube-api-access-59vs6") pod "9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9" (UID: "9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9"). InnerVolumeSpecName "kube-api-access-59vs6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.540808 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dd77d1cf-d25f-459c-95b4-96c63acd0462-client-ca\") pod \"route-controller-manager-6988ff758f-g2fv4\" (UID: \"dd77d1cf-d25f-459c-95b4-96c63acd0462\") " pod="openshift-route-controller-manager/route-controller-manager-6988ff758f-g2fv4" Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.541293 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mht7p\" (UniqueName: \"kubernetes.io/projected/dd77d1cf-d25f-459c-95b4-96c63acd0462-kube-api-access-mht7p\") pod \"route-controller-manager-6988ff758f-g2fv4\" (UID: \"dd77d1cf-d25f-459c-95b4-96c63acd0462\") " pod="openshift-route-controller-manager/route-controller-manager-6988ff758f-g2fv4" Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.541322 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dd77d1cf-d25f-459c-95b4-96c63acd0462-serving-cert\") pod \"route-controller-manager-6988ff758f-g2fv4\" (UID: \"dd77d1cf-d25f-459c-95b4-96c63acd0462\") " pod="openshift-route-controller-manager/route-controller-manager-6988ff758f-g2fv4" Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.541372 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd77d1cf-d25f-459c-95b4-96c63acd0462-config\") pod \"route-controller-manager-6988ff758f-g2fv4\" (UID: \"dd77d1cf-d25f-459c-95b4-96c63acd0462\") " pod="openshift-route-controller-manager/route-controller-manager-6988ff758f-g2fv4" Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.541418 4860 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-59vs6\" (UniqueName: \"kubernetes.io/projected/9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9-kube-api-access-59vs6\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.541429 4860 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.541438 4860 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.541450 4860 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.593268 4860 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-04 23:10:15.981593354 +0000 UTC Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.593304 4860 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6228h10m50.388291814s for next certificate rotation Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.642353 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mht7p\" (UniqueName: \"kubernetes.io/projected/dd77d1cf-d25f-459c-95b4-96c63acd0462-kube-api-access-mht7p\") pod \"route-controller-manager-6988ff758f-g2fv4\" (UID: \"dd77d1cf-d25f-459c-95b4-96c63acd0462\") " pod="openshift-route-controller-manager/route-controller-manager-6988ff758f-g2fv4" Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.642397 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/dd77d1cf-d25f-459c-95b4-96c63acd0462-serving-cert\") pod \"route-controller-manager-6988ff758f-g2fv4\" (UID: \"dd77d1cf-d25f-459c-95b4-96c63acd0462\") " pod="openshift-route-controller-manager/route-controller-manager-6988ff758f-g2fv4" Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.642449 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd77d1cf-d25f-459c-95b4-96c63acd0462-config\") pod \"route-controller-manager-6988ff758f-g2fv4\" (UID: \"dd77d1cf-d25f-459c-95b4-96c63acd0462\") " pod="openshift-route-controller-manager/route-controller-manager-6988ff758f-g2fv4" Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.642506 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dd77d1cf-d25f-459c-95b4-96c63acd0462-client-ca\") pod \"route-controller-manager-6988ff758f-g2fv4\" (UID: \"dd77d1cf-d25f-459c-95b4-96c63acd0462\") " pod="openshift-route-controller-manager/route-controller-manager-6988ff758f-g2fv4" Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.643819 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dd77d1cf-d25f-459c-95b4-96c63acd0462-client-ca\") pod \"route-controller-manager-6988ff758f-g2fv4\" (UID: \"dd77d1cf-d25f-459c-95b4-96c63acd0462\") " pod="openshift-route-controller-manager/route-controller-manager-6988ff758f-g2fv4" Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.648767 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dd77d1cf-d25f-459c-95b4-96c63acd0462-serving-cert\") pod \"route-controller-manager-6988ff758f-g2fv4\" (UID: \"dd77d1cf-d25f-459c-95b4-96c63acd0462\") " pod="openshift-route-controller-manager/route-controller-manager-6988ff758f-g2fv4" Mar 20 10:59:25 crc 
kubenswrapper[4860]: I0320 10:59:25.651543 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd77d1cf-d25f-459c-95b4-96c63acd0462-config\") pod \"route-controller-manager-6988ff758f-g2fv4\" (UID: \"dd77d1cf-d25f-459c-95b4-96c63acd0462\") " pod="openshift-route-controller-manager/route-controller-manager-6988ff758f-g2fv4" Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.661966 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mht7p\" (UniqueName: \"kubernetes.io/projected/dd77d1cf-d25f-459c-95b4-96c63acd0462-kube-api-access-mht7p\") pod \"route-controller-manager-6988ff758f-g2fv4\" (UID: \"dd77d1cf-d25f-459c-95b4-96c63acd0462\") " pod="openshift-route-controller-manager/route-controller-manager-6988ff758f-g2fv4" Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.681096 4860 generic.go:334] "Generic (PLEG): container finished" podID="5a806156-ca3b-43dd-8b19-c072188004b7" containerID="8e915f332982c6439672c93348e89229311f91ce1bb0d43f012cc3a9b09a0bfa" exitCode=0 Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.681185 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-678879cf8c-f74zz" event={"ID":"5a806156-ca3b-43dd-8b19-c072188004b7","Type":"ContainerDied","Data":"8e915f332982c6439672c93348e89229311f91ce1bb0d43f012cc3a9b09a0bfa"} Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.683265 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-56668659b9-pdhht" event={"ID":"9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9","Type":"ContainerDied","Data":"fd4fb2650db90abfc2fc04379e97c1d923867f1354b3859056544eb5327ee868"} Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.683325 4860 scope.go:117] "RemoveContainer" containerID="16a49325d8d5b96f821cc45d49f4e0565898b38dd7a7655ded8401b745e9fbeb" Mar 20 10:59:25 crc 
kubenswrapper[4860]: I0320 10:59:25.683549 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-56668659b9-pdhht" Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.690635 4860 generic.go:334] "Generic (PLEG): container finished" podID="ba2ab33e-6ecc-4eac-9aaa-256e6ff68236" containerID="133313b654a587091e098b1e8505700f3bdc77cfa2efebc3f2529891730788bc" exitCode=0 Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.690881 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566738-5cj22" event={"ID":"ba2ab33e-6ecc-4eac-9aaa-256e6ff68236","Type":"ContainerDied","Data":"133313b654a587091e098b1e8505700f3bdc77cfa2efebc3f2529891730788bc"} Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.750016 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6988ff758f-g2fv4" Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.757048 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56668659b9-pdhht"] Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.765109 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56668659b9-pdhht"] Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.867064 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-678879cf8c-f74zz" Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.955630 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5a806156-ca3b-43dd-8b19-c072188004b7-proxy-ca-bundles\") pod \"5a806156-ca3b-43dd-8b19-c072188004b7\" (UID: \"5a806156-ca3b-43dd-8b19-c072188004b7\") " Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.955999 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a806156-ca3b-43dd-8b19-c072188004b7-config\") pod \"5a806156-ca3b-43dd-8b19-c072188004b7\" (UID: \"5a806156-ca3b-43dd-8b19-c072188004b7\") " Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.956086 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a806156-ca3b-43dd-8b19-c072188004b7-serving-cert\") pod \"5a806156-ca3b-43dd-8b19-c072188004b7\" (UID: \"5a806156-ca3b-43dd-8b19-c072188004b7\") " Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.956140 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rsk6l\" (UniqueName: \"kubernetes.io/projected/5a806156-ca3b-43dd-8b19-c072188004b7-kube-api-access-rsk6l\") pod \"5a806156-ca3b-43dd-8b19-c072188004b7\" (UID: \"5a806156-ca3b-43dd-8b19-c072188004b7\") " Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.956166 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5a806156-ca3b-43dd-8b19-c072188004b7-client-ca\") pod \"5a806156-ca3b-43dd-8b19-c072188004b7\" (UID: \"5a806156-ca3b-43dd-8b19-c072188004b7\") " Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.957014 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/5a806156-ca3b-43dd-8b19-c072188004b7-client-ca" (OuterVolumeSpecName: "client-ca") pod "5a806156-ca3b-43dd-8b19-c072188004b7" (UID: "5a806156-ca3b-43dd-8b19-c072188004b7"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.957004 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a806156-ca3b-43dd-8b19-c072188004b7-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "5a806156-ca3b-43dd-8b19-c072188004b7" (UID: "5a806156-ca3b-43dd-8b19-c072188004b7"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.957047 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a806156-ca3b-43dd-8b19-c072188004b7-config" (OuterVolumeSpecName: "config") pod "5a806156-ca3b-43dd-8b19-c072188004b7" (UID: "5a806156-ca3b-43dd-8b19-c072188004b7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.963863 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a806156-ca3b-43dd-8b19-c072188004b7-kube-api-access-rsk6l" (OuterVolumeSpecName: "kube-api-access-rsk6l") pod "5a806156-ca3b-43dd-8b19-c072188004b7" (UID: "5a806156-ca3b-43dd-8b19-c072188004b7"). InnerVolumeSpecName "kube-api-access-rsk6l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.964309 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a806156-ca3b-43dd-8b19-c072188004b7-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5a806156-ca3b-43dd-8b19-c072188004b7" (UID: "5a806156-ca3b-43dd-8b19-c072188004b7"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:59:26 crc kubenswrapper[4860]: I0320 10:59:26.059248 4860 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5a806156-ca3b-43dd-8b19-c072188004b7-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:26 crc kubenswrapper[4860]: I0320 10:59:26.059529 4860 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a806156-ca3b-43dd-8b19-c072188004b7-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:26 crc kubenswrapper[4860]: I0320 10:59:26.062454 4860 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a806156-ca3b-43dd-8b19-c072188004b7-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:26 crc kubenswrapper[4860]: I0320 10:59:26.062477 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rsk6l\" (UniqueName: \"kubernetes.io/projected/5a806156-ca3b-43dd-8b19-c072188004b7-kube-api-access-rsk6l\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:26 crc kubenswrapper[4860]: I0320 10:59:26.062492 4860 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5a806156-ca3b-43dd-8b19-c072188004b7-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:26 crc kubenswrapper[4860]: I0320 10:59:26.354176 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6988ff758f-g2fv4"] Mar 20 10:59:26 crc kubenswrapper[4860]: W0320 10:59:26.365086 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd77d1cf_d25f_459c_95b4_96c63acd0462.slice/crio-e2890569cb8b864dbb24f8804e49612f8e5c8d8c8aceaa430ef20c9b0f36ef98 WatchSource:0}: Error finding container e2890569cb8b864dbb24f8804e49612f8e5c8d8c8aceaa430ef20c9b0f36ef98: 
Status 404 returned error can't find the container with id e2890569cb8b864dbb24f8804e49612f8e5c8d8c8aceaa430ef20c9b0f36ef98 Mar 20 10:59:26 crc kubenswrapper[4860]: I0320 10:59:26.603684 4860 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-08 04:00:06.258231721 +0000 UTC Mar 20 10:59:26 crc kubenswrapper[4860]: I0320 10:59:26.603739 4860 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7049h0m39.654496559s for next certificate rotation Mar 20 10:59:26 crc kubenswrapper[4860]: I0320 10:59:26.698127 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6988ff758f-g2fv4" event={"ID":"dd77d1cf-d25f-459c-95b4-96c63acd0462","Type":"ContainerStarted","Data":"7bce21f78b56a9d267f2935e0991aace1419bf6186a2be2b3f534d5ed5bf5798"} Mar 20 10:59:26 crc kubenswrapper[4860]: I0320 10:59:26.698186 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6988ff758f-g2fv4" event={"ID":"dd77d1cf-d25f-459c-95b4-96c63acd0462","Type":"ContainerStarted","Data":"e2890569cb8b864dbb24f8804e49612f8e5c8d8c8aceaa430ef20c9b0f36ef98"} Mar 20 10:59:26 crc kubenswrapper[4860]: I0320 10:59:26.704192 4860 generic.go:334] "Generic (PLEG): container finished" podID="f20cb95e-5480-4c9c-859f-0b03d679ab06" containerID="01d41166d3fd0e46f3b970c35d6bd76db150d430d4adb17e4e8f0fd1e47c7e63" exitCode=0 Mar 20 10:59:26 crc kubenswrapper[4860]: I0320 10:59:26.704294 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d9xlp" event={"ID":"f20cb95e-5480-4c9c-859f-0b03d679ab06","Type":"ContainerDied","Data":"01d41166d3fd0e46f3b970c35d6bd76db150d430d4adb17e4e8f0fd1e47c7e63"} Mar 20 10:59:26 crc kubenswrapper[4860]: I0320 10:59:26.706177 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-678879cf8c-f74zz" event={"ID":"5a806156-ca3b-43dd-8b19-c072188004b7","Type":"ContainerDied","Data":"bc7889716935ff28fa767611a372177d38ff47e7dcc741593b2d67958a68c43b"} Mar 20 10:59:26 crc kubenswrapper[4860]: I0320 10:59:26.706241 4860 scope.go:117] "RemoveContainer" containerID="8e915f332982c6439672c93348e89229311f91ce1bb0d43f012cc3a9b09a0bfa" Mar 20 10:59:26 crc kubenswrapper[4860]: I0320 10:59:26.706329 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-678879cf8c-f74zz" Mar 20 10:59:26 crc kubenswrapper[4860]: I0320 10:59:26.723034 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6988ff758f-g2fv4" podStartSLOduration=2.723005612 podStartE2EDuration="2.723005612s" podCreationTimestamp="2026-03-20 10:59:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:59:26.719465382 +0000 UTC m=+290.940826290" watchObservedRunningTime="2026-03-20 10:59:26.723005612 +0000 UTC m=+290.944366520" Mar 20 10:59:26 crc kubenswrapper[4860]: I0320 10:59:26.771200 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-678879cf8c-f74zz"] Mar 20 10:59:26 crc kubenswrapper[4860]: I0320 10:59:26.777417 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-678879cf8c-f74zz"] Mar 20 10:59:27 crc kubenswrapper[4860]: I0320 10:59:27.073821 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566738-5cj22" Mar 20 10:59:27 crc kubenswrapper[4860]: I0320 10:59:27.180418 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hjpq\" (UniqueName: \"kubernetes.io/projected/ba2ab33e-6ecc-4eac-9aaa-256e6ff68236-kube-api-access-5hjpq\") pod \"ba2ab33e-6ecc-4eac-9aaa-256e6ff68236\" (UID: \"ba2ab33e-6ecc-4eac-9aaa-256e6ff68236\") " Mar 20 10:59:27 crc kubenswrapper[4860]: I0320 10:59:27.189070 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba2ab33e-6ecc-4eac-9aaa-256e6ff68236-kube-api-access-5hjpq" (OuterVolumeSpecName: "kube-api-access-5hjpq") pod "ba2ab33e-6ecc-4eac-9aaa-256e6ff68236" (UID: "ba2ab33e-6ecc-4eac-9aaa-256e6ff68236"). InnerVolumeSpecName "kube-api-access-5hjpq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:59:27 crc kubenswrapper[4860]: I0320 10:59:27.283960 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hjpq\" (UniqueName: \"kubernetes.io/projected/ba2ab33e-6ecc-4eac-9aaa-256e6ff68236-kube-api-access-5hjpq\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:27 crc kubenswrapper[4860]: I0320 10:59:27.421129 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a806156-ca3b-43dd-8b19-c072188004b7" path="/var/lib/kubelet/pods/5a806156-ca3b-43dd-8b19-c072188004b7/volumes" Mar 20 10:59:27 crc kubenswrapper[4860]: I0320 10:59:27.423710 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9" path="/var/lib/kubelet/pods/9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9/volumes" Mar 20 10:59:27 crc kubenswrapper[4860]: I0320 10:59:27.726857 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566738-5cj22" 
event={"ID":"ba2ab33e-6ecc-4eac-9aaa-256e6ff68236","Type":"ContainerDied","Data":"ea1a7118d7d4729065b9248b97584b35507102283df7254e36c0c2abc1c111d1"} Mar 20 10:59:27 crc kubenswrapper[4860]: I0320 10:59:27.726910 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea1a7118d7d4729065b9248b97584b35507102283df7254e36c0c2abc1c111d1" Mar 20 10:59:27 crc kubenswrapper[4860]: I0320 10:59:27.728291 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566738-5cj22" Mar 20 10:59:27 crc kubenswrapper[4860]: I0320 10:59:27.729979 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6988ff758f-g2fv4" Mar 20 10:59:27 crc kubenswrapper[4860]: I0320 10:59:27.748309 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6988ff758f-g2fv4" Mar 20 10:59:28 crc kubenswrapper[4860]: I0320 10:59:28.091565 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6fc74dd9cd-gwtmv"] Mar 20 10:59:28 crc kubenswrapper[4860]: E0320 10:59:28.092161 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba2ab33e-6ecc-4eac-9aaa-256e6ff68236" containerName="oc" Mar 20 10:59:28 crc kubenswrapper[4860]: I0320 10:59:28.092189 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba2ab33e-6ecc-4eac-9aaa-256e6ff68236" containerName="oc" Mar 20 10:59:28 crc kubenswrapper[4860]: E0320 10:59:28.092208 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a806156-ca3b-43dd-8b19-c072188004b7" containerName="controller-manager" Mar 20 10:59:28 crc kubenswrapper[4860]: I0320 10:59:28.092251 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a806156-ca3b-43dd-8b19-c072188004b7" containerName="controller-manager" Mar 20 10:59:28 crc 
kubenswrapper[4860]: I0320 10:59:28.092406 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba2ab33e-6ecc-4eac-9aaa-256e6ff68236" containerName="oc" Mar 20 10:59:28 crc kubenswrapper[4860]: I0320 10:59:28.092447 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a806156-ca3b-43dd-8b19-c072188004b7" containerName="controller-manager" Mar 20 10:59:28 crc kubenswrapper[4860]: I0320 10:59:28.093011 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6fc74dd9cd-gwtmv" Mar 20 10:59:28 crc kubenswrapper[4860]: I0320 10:59:28.103528 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 10:59:28 crc kubenswrapper[4860]: I0320 10:59:28.104342 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 10:59:28 crc kubenswrapper[4860]: I0320 10:59:28.105060 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6fc74dd9cd-gwtmv"] Mar 20 10:59:28 crc kubenswrapper[4860]: I0320 10:59:28.115914 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 10:59:28 crc kubenswrapper[4860]: I0320 10:59:28.117033 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 10:59:28 crc kubenswrapper[4860]: I0320 10:59:28.117708 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 10:59:28 crc kubenswrapper[4860]: I0320 10:59:28.121192 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 10:59:28 crc kubenswrapper[4860]: I0320 10:59:28.130567 4860 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 10:59:28 crc kubenswrapper[4860]: I0320 10:59:28.222873 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d61e9234-ffb5-44de-b42c-3c4a3028a994-client-ca\") pod \"controller-manager-6fc74dd9cd-gwtmv\" (UID: \"d61e9234-ffb5-44de-b42c-3c4a3028a994\") " pod="openshift-controller-manager/controller-manager-6fc74dd9cd-gwtmv" Mar 20 10:59:28 crc kubenswrapper[4860]: I0320 10:59:28.222948 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d61e9234-ffb5-44de-b42c-3c4a3028a994-config\") pod \"controller-manager-6fc74dd9cd-gwtmv\" (UID: \"d61e9234-ffb5-44de-b42c-3c4a3028a994\") " pod="openshift-controller-manager/controller-manager-6fc74dd9cd-gwtmv" Mar 20 10:59:28 crc kubenswrapper[4860]: I0320 10:59:28.222987 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d61e9234-ffb5-44de-b42c-3c4a3028a994-proxy-ca-bundles\") pod \"controller-manager-6fc74dd9cd-gwtmv\" (UID: \"d61e9234-ffb5-44de-b42c-3c4a3028a994\") " pod="openshift-controller-manager/controller-manager-6fc74dd9cd-gwtmv" Mar 20 10:59:28 crc kubenswrapper[4860]: I0320 10:59:28.223010 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmgqj\" (UniqueName: \"kubernetes.io/projected/d61e9234-ffb5-44de-b42c-3c4a3028a994-kube-api-access-tmgqj\") pod \"controller-manager-6fc74dd9cd-gwtmv\" (UID: \"d61e9234-ffb5-44de-b42c-3c4a3028a994\") " pod="openshift-controller-manager/controller-manager-6fc74dd9cd-gwtmv" Mar 20 10:59:28 crc kubenswrapper[4860]: I0320 10:59:28.223043 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/d61e9234-ffb5-44de-b42c-3c4a3028a994-serving-cert\") pod \"controller-manager-6fc74dd9cd-gwtmv\" (UID: \"d61e9234-ffb5-44de-b42c-3c4a3028a994\") " pod="openshift-controller-manager/controller-manager-6fc74dd9cd-gwtmv" Mar 20 10:59:28 crc kubenswrapper[4860]: I0320 10:59:28.324705 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d61e9234-ffb5-44de-b42c-3c4a3028a994-proxy-ca-bundles\") pod \"controller-manager-6fc74dd9cd-gwtmv\" (UID: \"d61e9234-ffb5-44de-b42c-3c4a3028a994\") " pod="openshift-controller-manager/controller-manager-6fc74dd9cd-gwtmv" Mar 20 10:59:28 crc kubenswrapper[4860]: I0320 10:59:28.324839 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmgqj\" (UniqueName: \"kubernetes.io/projected/d61e9234-ffb5-44de-b42c-3c4a3028a994-kube-api-access-tmgqj\") pod \"controller-manager-6fc74dd9cd-gwtmv\" (UID: \"d61e9234-ffb5-44de-b42c-3c4a3028a994\") " pod="openshift-controller-manager/controller-manager-6fc74dd9cd-gwtmv" Mar 20 10:59:28 crc kubenswrapper[4860]: I0320 10:59:28.324898 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d61e9234-ffb5-44de-b42c-3c4a3028a994-serving-cert\") pod \"controller-manager-6fc74dd9cd-gwtmv\" (UID: \"d61e9234-ffb5-44de-b42c-3c4a3028a994\") " pod="openshift-controller-manager/controller-manager-6fc74dd9cd-gwtmv" Mar 20 10:59:28 crc kubenswrapper[4860]: I0320 10:59:28.325002 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d61e9234-ffb5-44de-b42c-3c4a3028a994-client-ca\") pod \"controller-manager-6fc74dd9cd-gwtmv\" (UID: \"d61e9234-ffb5-44de-b42c-3c4a3028a994\") " pod="openshift-controller-manager/controller-manager-6fc74dd9cd-gwtmv" Mar 20 10:59:28 crc 
kubenswrapper[4860]: I0320 10:59:28.325048 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d61e9234-ffb5-44de-b42c-3c4a3028a994-config\") pod \"controller-manager-6fc74dd9cd-gwtmv\" (UID: \"d61e9234-ffb5-44de-b42c-3c4a3028a994\") " pod="openshift-controller-manager/controller-manager-6fc74dd9cd-gwtmv" Mar 20 10:59:28 crc kubenswrapper[4860]: I0320 10:59:28.326038 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d61e9234-ffb5-44de-b42c-3c4a3028a994-client-ca\") pod \"controller-manager-6fc74dd9cd-gwtmv\" (UID: \"d61e9234-ffb5-44de-b42c-3c4a3028a994\") " pod="openshift-controller-manager/controller-manager-6fc74dd9cd-gwtmv" Mar 20 10:59:28 crc kubenswrapper[4860]: I0320 10:59:28.326555 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d61e9234-ffb5-44de-b42c-3c4a3028a994-proxy-ca-bundles\") pod \"controller-manager-6fc74dd9cd-gwtmv\" (UID: \"d61e9234-ffb5-44de-b42c-3c4a3028a994\") " pod="openshift-controller-manager/controller-manager-6fc74dd9cd-gwtmv" Mar 20 10:59:28 crc kubenswrapper[4860]: I0320 10:59:28.327178 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d61e9234-ffb5-44de-b42c-3c4a3028a994-config\") pod \"controller-manager-6fc74dd9cd-gwtmv\" (UID: \"d61e9234-ffb5-44de-b42c-3c4a3028a994\") " pod="openshift-controller-manager/controller-manager-6fc74dd9cd-gwtmv" Mar 20 10:59:28 crc kubenswrapper[4860]: I0320 10:59:28.342672 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d61e9234-ffb5-44de-b42c-3c4a3028a994-serving-cert\") pod \"controller-manager-6fc74dd9cd-gwtmv\" (UID: \"d61e9234-ffb5-44de-b42c-3c4a3028a994\") " 
pod="openshift-controller-manager/controller-manager-6fc74dd9cd-gwtmv" Mar 20 10:59:28 crc kubenswrapper[4860]: I0320 10:59:28.346017 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmgqj\" (UniqueName: \"kubernetes.io/projected/d61e9234-ffb5-44de-b42c-3c4a3028a994-kube-api-access-tmgqj\") pod \"controller-manager-6fc74dd9cd-gwtmv\" (UID: \"d61e9234-ffb5-44de-b42c-3c4a3028a994\") " pod="openshift-controller-manager/controller-manager-6fc74dd9cd-gwtmv" Mar 20 10:59:28 crc kubenswrapper[4860]: I0320 10:59:28.429296 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6fc74dd9cd-gwtmv" Mar 20 10:59:29 crc kubenswrapper[4860]: I0320 10:59:29.744916 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d9xlp" event={"ID":"f20cb95e-5480-4c9c-859f-0b03d679ab06","Type":"ContainerStarted","Data":"3d9773a58e39a2f11588636465b237f3ac7e55d6ecd57abbb192e60cfa1df11f"} Mar 20 10:59:29 crc kubenswrapper[4860]: I0320 10:59:29.749802 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w5w95" event={"ID":"4f84f111-5991-4e78-9508-82283b8e36f7","Type":"ContainerStarted","Data":"1e4d0d88e053aac4fbcbc20ef32b8805ad5fa659b78c3257d2c273ebeb4ec532"} Mar 20 10:59:29 crc kubenswrapper[4860]: I0320 10:59:29.774984 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6fc74dd9cd-gwtmv"] Mar 20 10:59:29 crc kubenswrapper[4860]: I0320 10:59:29.781781 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-d9xlp" podStartSLOduration=2.455059678 podStartE2EDuration="57.781757975s" podCreationTimestamp="2026-03-20 10:58:32 +0000 UTC" firstStartedPulling="2026-03-20 10:58:33.978484559 +0000 UTC m=+238.199845457" lastFinishedPulling="2026-03-20 10:59:29.305182856 +0000 UTC 
m=+293.526543754" observedRunningTime="2026-03-20 10:59:29.769894929 +0000 UTC m=+293.991255857" watchObservedRunningTime="2026-03-20 10:59:29.781757975 +0000 UTC m=+294.003118873" Mar 20 10:59:30 crc kubenswrapper[4860]: I0320 10:59:30.757585 4860 generic.go:334] "Generic (PLEG): container finished" podID="4f84f111-5991-4e78-9508-82283b8e36f7" containerID="1e4d0d88e053aac4fbcbc20ef32b8805ad5fa659b78c3257d2c273ebeb4ec532" exitCode=0 Mar 20 10:59:30 crc kubenswrapper[4860]: I0320 10:59:30.757683 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w5w95" event={"ID":"4f84f111-5991-4e78-9508-82283b8e36f7","Type":"ContainerDied","Data":"1e4d0d88e053aac4fbcbc20ef32b8805ad5fa659b78c3257d2c273ebeb4ec532"} Mar 20 10:59:30 crc kubenswrapper[4860]: I0320 10:59:30.765924 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6fc74dd9cd-gwtmv" event={"ID":"d61e9234-ffb5-44de-b42c-3c4a3028a994","Type":"ContainerStarted","Data":"411ec5427fcb58cf525e6877c9bd13ab7217beeed0854afb3fa90c2410f789ec"} Mar 20 10:59:30 crc kubenswrapper[4860]: I0320 10:59:30.765967 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6fc74dd9cd-gwtmv" event={"ID":"d61e9234-ffb5-44de-b42c-3c4a3028a994","Type":"ContainerStarted","Data":"b8dcffcb468a289a993f7232cdab0940ede87b61eba385b847779a075838b8d1"} Mar 20 10:59:30 crc kubenswrapper[4860]: I0320 10:59:30.766203 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6fc74dd9cd-gwtmv" Mar 20 10:59:30 crc kubenswrapper[4860]: I0320 10:59:30.781535 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6fc74dd9cd-gwtmv" Mar 20 10:59:30 crc kubenswrapper[4860]: I0320 10:59:30.802954 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-controller-manager/controller-manager-6fc74dd9cd-gwtmv" podStartSLOduration=6.80292667 podStartE2EDuration="6.80292667s" podCreationTimestamp="2026-03-20 10:59:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:59:30.801761117 +0000 UTC m=+295.023122015" watchObservedRunningTime="2026-03-20 10:59:30.80292667 +0000 UTC m=+295.024287568" Mar 20 10:59:31 crc kubenswrapper[4860]: I0320 10:59:31.236522 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-srz5x"] Mar 20 10:59:32 crc kubenswrapper[4860]: I0320 10:59:32.399894 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-45vfv" Mar 20 10:59:32 crc kubenswrapper[4860]: I0320 10:59:32.462207 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-d9xlp" Mar 20 10:59:32 crc kubenswrapper[4860]: I0320 10:59:32.462418 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-d9xlp" Mar 20 10:59:32 crc kubenswrapper[4860]: I0320 10:59:32.785682 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n79b7" event={"ID":"d2690d8b-c7f7-4e71-af44-33444e4d6187","Type":"ContainerStarted","Data":"dca022e41de9a1a813fac9c51ffa02dd710183b0522c00eebac503fc8a8a606a"} Mar 20 10:59:32 crc kubenswrapper[4860]: I0320 10:59:32.795061 4860 generic.go:334] "Generic (PLEG): container finished" podID="7b622f82-e01c-42b8-8061-16b6e8f551fb" containerID="c56ad28f23266e55b726d6def152f7daf70f2346cb9ad3e38d40fd7ed925aeca" exitCode=0 Mar 20 10:59:32 crc kubenswrapper[4860]: I0320 10:59:32.795148 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p6sh9" 
event={"ID":"7b622f82-e01c-42b8-8061-16b6e8f551fb","Type":"ContainerDied","Data":"c56ad28f23266e55b726d6def152f7daf70f2346cb9ad3e38d40fd7ed925aeca"} Mar 20 10:59:32 crc kubenswrapper[4860]: I0320 10:59:32.801621 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5jpww" event={"ID":"2268b7ae-c1db-4ef4-8236-60f7cfa277a1","Type":"ContainerStarted","Data":"a570fccad49b61cba0e967c8a578dc38074b1cc636e9a24780ad0104b28b074c"} Mar 20 10:59:32 crc kubenswrapper[4860]: I0320 10:59:32.806571 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jx27x" event={"ID":"f81a43aa-2c39-4d49-8526-f097322dd7bf","Type":"ContainerStarted","Data":"5000df7098db21086cd500235d0dfe1cd2c6c277e01022d3498595c652a31046"} Mar 20 10:59:32 crc kubenswrapper[4860]: I0320 10:59:32.811542 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r7ckk" event={"ID":"f0e14a08-824b-450f-bf98-2a476da0d44b","Type":"ContainerStarted","Data":"a5700f11fea1e3d3f4586b36f718ac04fc6f4838eaf0dec6f0868235a01313a6"} Mar 20 10:59:33 crc kubenswrapper[4860]: I0320 10:59:33.778901 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-d9xlp" podUID="f20cb95e-5480-4c9c-859f-0b03d679ab06" containerName="registry-server" probeResult="failure" output=< Mar 20 10:59:33 crc kubenswrapper[4860]: timeout: failed to connect service ":50051" within 1s Mar 20 10:59:33 crc kubenswrapper[4860]: > Mar 20 10:59:33 crc kubenswrapper[4860]: I0320 10:59:33.820331 4860 generic.go:334] "Generic (PLEG): container finished" podID="f81a43aa-2c39-4d49-8526-f097322dd7bf" containerID="5000df7098db21086cd500235d0dfe1cd2c6c277e01022d3498595c652a31046" exitCode=0 Mar 20 10:59:33 crc kubenswrapper[4860]: I0320 10:59:33.820413 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jx27x" 
event={"ID":"f81a43aa-2c39-4d49-8526-f097322dd7bf","Type":"ContainerDied","Data":"5000df7098db21086cd500235d0dfe1cd2c6c277e01022d3498595c652a31046"} Mar 20 10:59:33 crc kubenswrapper[4860]: I0320 10:59:33.823686 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w5w95" event={"ID":"4f84f111-5991-4e78-9508-82283b8e36f7","Type":"ContainerStarted","Data":"bd0572960d7229285fe10a8cb3d58bbc9ca87bc5ab990b09db80034d14aa31ff"} Mar 20 10:59:33 crc kubenswrapper[4860]: I0320 10:59:33.826767 4860 generic.go:334] "Generic (PLEG): container finished" podID="f0e14a08-824b-450f-bf98-2a476da0d44b" containerID="a5700f11fea1e3d3f4586b36f718ac04fc6f4838eaf0dec6f0868235a01313a6" exitCode=0 Mar 20 10:59:33 crc kubenswrapper[4860]: I0320 10:59:33.826867 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r7ckk" event={"ID":"f0e14a08-824b-450f-bf98-2a476da0d44b","Type":"ContainerDied","Data":"a5700f11fea1e3d3f4586b36f718ac04fc6f4838eaf0dec6f0868235a01313a6"} Mar 20 10:59:33 crc kubenswrapper[4860]: I0320 10:59:33.837825 4860 generic.go:334] "Generic (PLEG): container finished" podID="d2690d8b-c7f7-4e71-af44-33444e4d6187" containerID="dca022e41de9a1a813fac9c51ffa02dd710183b0522c00eebac503fc8a8a606a" exitCode=0 Mar 20 10:59:33 crc kubenswrapper[4860]: I0320 10:59:33.837963 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n79b7" event={"ID":"d2690d8b-c7f7-4e71-af44-33444e4d6187","Type":"ContainerDied","Data":"dca022e41de9a1a813fac9c51ffa02dd710183b0522c00eebac503fc8a8a606a"} Mar 20 10:59:33 crc kubenswrapper[4860]: I0320 10:59:33.855129 4860 generic.go:334] "Generic (PLEG): container finished" podID="2268b7ae-c1db-4ef4-8236-60f7cfa277a1" containerID="a570fccad49b61cba0e967c8a578dc38074b1cc636e9a24780ad0104b28b074c" exitCode=0 Mar 20 10:59:33 crc kubenswrapper[4860]: I0320 10:59:33.855197 4860 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-5jpww" event={"ID":"2268b7ae-c1db-4ef4-8236-60f7cfa277a1","Type":"ContainerDied","Data":"a570fccad49b61cba0e967c8a578dc38074b1cc636e9a24780ad0104b28b074c"} Mar 20 10:59:33 crc kubenswrapper[4860]: I0320 10:59:33.870301 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-w5w95" podStartSLOduration=4.472739806 podStartE2EDuration="1m3.870261727s" podCreationTimestamp="2026-03-20 10:58:30 +0000 UTC" firstStartedPulling="2026-03-20 10:58:32.964598797 +0000 UTC m=+237.185959695" lastFinishedPulling="2026-03-20 10:59:32.362120718 +0000 UTC m=+296.583481616" observedRunningTime="2026-03-20 10:59:33.865497472 +0000 UTC m=+298.086858390" watchObservedRunningTime="2026-03-20 10:59:33.870261727 +0000 UTC m=+298.091622635" Mar 20 10:59:35 crc kubenswrapper[4860]: I0320 10:59:35.869795 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p6sh9" event={"ID":"7b622f82-e01c-42b8-8061-16b6e8f551fb","Type":"ContainerStarted","Data":"7dbc6b1d3db25220d5d86cbd542a694d0703ce989da64d73e6670a17039b2234"} Mar 20 10:59:35 crc kubenswrapper[4860]: I0320 10:59:35.894560 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-p6sh9" podStartSLOduration=3.780842911 podStartE2EDuration="1m5.894518046s" podCreationTimestamp="2026-03-20 10:58:30 +0000 UTC" firstStartedPulling="2026-03-20 10:58:32.783419718 +0000 UTC m=+237.004780616" lastFinishedPulling="2026-03-20 10:59:34.897094853 +0000 UTC m=+299.118455751" observedRunningTime="2026-03-20 10:59:35.889381351 +0000 UTC m=+300.110742279" watchObservedRunningTime="2026-03-20 10:59:35.894518046 +0000 UTC m=+300.115878944" Mar 20 10:59:40 crc kubenswrapper[4860]: I0320 10:59:40.984090 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-w5w95" Mar 20 
10:59:40 crc kubenswrapper[4860]: I0320 10:59:40.984883 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-w5w95" Mar 20 10:59:41 crc kubenswrapper[4860]: I0320 10:59:41.063056 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-w5w95" Mar 20 10:59:41 crc kubenswrapper[4860]: I0320 10:59:41.181146 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-p6sh9" Mar 20 10:59:41 crc kubenswrapper[4860]: I0320 10:59:41.181216 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-p6sh9" Mar 20 10:59:41 crc kubenswrapper[4860]: I0320 10:59:41.227999 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-p6sh9" Mar 20 10:59:41 crc kubenswrapper[4860]: I0320 10:59:41.913514 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5jpww" event={"ID":"2268b7ae-c1db-4ef4-8236-60f7cfa277a1","Type":"ContainerStarted","Data":"e9d5be622305f211c3e12e12dfe924fbcb59aee9d65c271786e393a4e06cec5a"} Mar 20 10:59:41 crc kubenswrapper[4860]: I0320 10:59:41.916405 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qq8bh" event={"ID":"514f05c3-1404-46c6-9f4d-68437ea8ee0b","Type":"ContainerStarted","Data":"844c418dac774548dbf403e3c458e1c031e1eab485a8882313d1a36213143b22"} Mar 20 10:59:41 crc kubenswrapper[4860]: I0320 10:59:41.965415 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-w5w95" Mar 20 10:59:41 crc kubenswrapper[4860]: I0320 10:59:41.980967 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-p6sh9" Mar 20 10:59:42 crc 
kubenswrapper[4860]: I0320 10:59:42.524032 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-d9xlp" Mar 20 10:59:42 crc kubenswrapper[4860]: I0320 10:59:42.574082 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-d9xlp" Mar 20 10:59:42 crc kubenswrapper[4860]: I0320 10:59:42.649699 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p6sh9"] Mar 20 10:59:42 crc kubenswrapper[4860]: I0320 10:59:42.926503 4860 generic.go:334] "Generic (PLEG): container finished" podID="514f05c3-1404-46c6-9f4d-68437ea8ee0b" containerID="844c418dac774548dbf403e3c458e1c031e1eab485a8882313d1a36213143b22" exitCode=0 Mar 20 10:59:42 crc kubenswrapper[4860]: I0320 10:59:42.927073 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qq8bh" event={"ID":"514f05c3-1404-46c6-9f4d-68437ea8ee0b","Type":"ContainerDied","Data":"844c418dac774548dbf403e3c458e1c031e1eab485a8882313d1a36213143b22"} Mar 20 10:59:42 crc kubenswrapper[4860]: I0320 10:59:42.970837 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5jpww" podStartSLOduration=4.484042726 podStartE2EDuration="1m10.97080895s" podCreationTimestamp="2026-03-20 10:58:32 +0000 UTC" firstStartedPulling="2026-03-20 10:58:33.988841558 +0000 UTC m=+238.210202456" lastFinishedPulling="2026-03-20 10:59:40.475607782 +0000 UTC m=+304.696968680" observedRunningTime="2026-03-20 10:59:42.968131074 +0000 UTC m=+307.189491972" watchObservedRunningTime="2026-03-20 10:59:42.97080895 +0000 UTC m=+307.192169858" Mar 20 10:59:43 crc kubenswrapper[4860]: I0320 10:59:43.932989 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-p6sh9" podUID="7b622f82-e01c-42b8-8061-16b6e8f551fb" 
containerName="registry-server" containerID="cri-o://7dbc6b1d3db25220d5d86cbd542a694d0703ce989da64d73e6670a17039b2234" gracePeriod=2 Mar 20 10:59:44 crc kubenswrapper[4860]: I0320 10:59:44.133692 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6fc74dd9cd-gwtmv"] Mar 20 10:59:44 crc kubenswrapper[4860]: I0320 10:59:44.134024 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6fc74dd9cd-gwtmv" podUID="d61e9234-ffb5-44de-b42c-3c4a3028a994" containerName="controller-manager" containerID="cri-o://411ec5427fcb58cf525e6877c9bd13ab7217beeed0854afb3fa90c2410f789ec" gracePeriod=30 Mar 20 10:59:44 crc kubenswrapper[4860]: I0320 10:59:44.234044 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6988ff758f-g2fv4"] Mar 20 10:59:44 crc kubenswrapper[4860]: I0320 10:59:44.234332 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6988ff758f-g2fv4" podUID="dd77d1cf-d25f-459c-95b4-96c63acd0462" containerName="route-controller-manager" containerID="cri-o://7bce21f78b56a9d267f2935e0991aace1419bf6186a2be2b3f534d5ed5bf5798" gracePeriod=30 Mar 20 10:59:44 crc kubenswrapper[4860]: I0320 10:59:44.941932 4860 generic.go:334] "Generic (PLEG): container finished" podID="d61e9234-ffb5-44de-b42c-3c4a3028a994" containerID="411ec5427fcb58cf525e6877c9bd13ab7217beeed0854afb3fa90c2410f789ec" exitCode=0 Mar 20 10:59:44 crc kubenswrapper[4860]: I0320 10:59:44.942166 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6fc74dd9cd-gwtmv" event={"ID":"d61e9234-ffb5-44de-b42c-3c4a3028a994","Type":"ContainerDied","Data":"411ec5427fcb58cf525e6877c9bd13ab7217beeed0854afb3fa90c2410f789ec"} Mar 20 10:59:45 crc kubenswrapper[4860]: I0320 10:59:45.751476 4860 
patch_prober.go:28] interesting pod/route-controller-manager-6988ff758f-g2fv4 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.63:8443/healthz\": dial tcp 10.217.0.63:8443: connect: connection refused" start-of-body= Mar 20 10:59:45 crc kubenswrapper[4860]: I0320 10:59:45.751571 4860 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6988ff758f-g2fv4" podUID="dd77d1cf-d25f-459c-95b4-96c63acd0462" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.63:8443/healthz\": dial tcp 10.217.0.63:8443: connect: connection refused" Mar 20 10:59:45 crc kubenswrapper[4860]: I0320 10:59:45.969411 4860 generic.go:334] "Generic (PLEG): container finished" podID="7b622f82-e01c-42b8-8061-16b6e8f551fb" containerID="7dbc6b1d3db25220d5d86cbd542a694d0703ce989da64d73e6670a17039b2234" exitCode=0 Mar 20 10:59:45 crc kubenswrapper[4860]: I0320 10:59:45.969516 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p6sh9" event={"ID":"7b622f82-e01c-42b8-8061-16b6e8f551fb","Type":"ContainerDied","Data":"7dbc6b1d3db25220d5d86cbd542a694d0703ce989da64d73e6670a17039b2234"} Mar 20 10:59:45 crc kubenswrapper[4860]: I0320 10:59:45.971989 4860 generic.go:334] "Generic (PLEG): container finished" podID="dd77d1cf-d25f-459c-95b4-96c63acd0462" containerID="7bce21f78b56a9d267f2935e0991aace1419bf6186a2be2b3f534d5ed5bf5798" exitCode=0 Mar 20 10:59:45 crc kubenswrapper[4860]: I0320 10:59:45.972036 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6988ff758f-g2fv4" event={"ID":"dd77d1cf-d25f-459c-95b4-96c63acd0462","Type":"ContainerDied","Data":"7bce21f78b56a9d267f2935e0991aace1419bf6186a2be2b3f534d5ed5bf5798"} Mar 20 10:59:46 crc kubenswrapper[4860]: I0320 10:59:46.929576 4860 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6fc74dd9cd-gwtmv"
Mar 20 10:59:46 crc kubenswrapper[4860]: I0320 10:59:46.964430 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-76ccc9cdd4-wwr8m"]
Mar 20 10:59:46 crc kubenswrapper[4860]: E0320 10:59:46.964738 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d61e9234-ffb5-44de-b42c-3c4a3028a994" containerName="controller-manager"
Mar 20 10:59:46 crc kubenswrapper[4860]: I0320 10:59:46.964756 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="d61e9234-ffb5-44de-b42c-3c4a3028a994" containerName="controller-manager"
Mar 20 10:59:46 crc kubenswrapper[4860]: I0320 10:59:46.964904 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="d61e9234-ffb5-44de-b42c-3c4a3028a994" containerName="controller-manager"
Mar 20 10:59:46 crc kubenswrapper[4860]: I0320 10:59:46.965426 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-76ccc9cdd4-wwr8m"
Mar 20 10:59:46 crc kubenswrapper[4860]: I0320 10:59:46.984793 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-76ccc9cdd4-wwr8m"]
Mar 20 10:59:46 crc kubenswrapper[4860]: I0320 10:59:46.990477 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6fc74dd9cd-gwtmv" event={"ID":"d61e9234-ffb5-44de-b42c-3c4a3028a994","Type":"ContainerDied","Data":"b8dcffcb468a289a993f7232cdab0940ede87b61eba385b847779a075838b8d1"}
Mar 20 10:59:46 crc kubenswrapper[4860]: I0320 10:59:46.990573 4860 scope.go:117] "RemoveContainer" containerID="411ec5427fcb58cf525e6877c9bd13ab7217beeed0854afb3fa90c2410f789ec"
Mar 20 10:59:46 crc kubenswrapper[4860]: I0320 10:59:46.990744 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6fc74dd9cd-gwtmv"
Mar 20 10:59:46 crc kubenswrapper[4860]: I0320 10:59:46.995159 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n79b7" event={"ID":"d2690d8b-c7f7-4e71-af44-33444e4d6187","Type":"ContainerStarted","Data":"8e14e348104a685ae72ae8d4fc93750bfae41d97762c8a26ba93a04b1dc8a3d4"}
Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.041084 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d61e9234-ffb5-44de-b42c-3c4a3028a994-config\") pod \"d61e9234-ffb5-44de-b42c-3c4a3028a994\" (UID: \"d61e9234-ffb5-44de-b42c-3c4a3028a994\") "
Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.041209 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d61e9234-ffb5-44de-b42c-3c4a3028a994-client-ca\") pod \"d61e9234-ffb5-44de-b42c-3c4a3028a994\" (UID: \"d61e9234-ffb5-44de-b42c-3c4a3028a994\") "
Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.041347 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmgqj\" (UniqueName: \"kubernetes.io/projected/d61e9234-ffb5-44de-b42c-3c4a3028a994-kube-api-access-tmgqj\") pod \"d61e9234-ffb5-44de-b42c-3c4a3028a994\" (UID: \"d61e9234-ffb5-44de-b42c-3c4a3028a994\") "
Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.041439 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d61e9234-ffb5-44de-b42c-3c4a3028a994-serving-cert\") pod \"d61e9234-ffb5-44de-b42c-3c4a3028a994\" (UID: \"d61e9234-ffb5-44de-b42c-3c4a3028a994\") "
Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.041473 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d61e9234-ffb5-44de-b42c-3c4a3028a994-proxy-ca-bundles\") pod \"d61e9234-ffb5-44de-b42c-3c4a3028a994\" (UID: \"d61e9234-ffb5-44de-b42c-3c4a3028a994\") "
Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.041719 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a1c3ede-cf8a-4fc1-9a11-642b5546723c-serving-cert\") pod \"controller-manager-76ccc9cdd4-wwr8m\" (UID: \"3a1c3ede-cf8a-4fc1-9a11-642b5546723c\") " pod="openshift-controller-manager/controller-manager-76ccc9cdd4-wwr8m"
Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.041768 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3a1c3ede-cf8a-4fc1-9a11-642b5546723c-proxy-ca-bundles\") pod \"controller-manager-76ccc9cdd4-wwr8m\" (UID: \"3a1c3ede-cf8a-4fc1-9a11-642b5546723c\") " pod="openshift-controller-manager/controller-manager-76ccc9cdd4-wwr8m"
Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.041800 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a1c3ede-cf8a-4fc1-9a11-642b5546723c-config\") pod \"controller-manager-76ccc9cdd4-wwr8m\" (UID: \"3a1c3ede-cf8a-4fc1-9a11-642b5546723c\") " pod="openshift-controller-manager/controller-manager-76ccc9cdd4-wwr8m"
Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.041851 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74fmb\" (UniqueName: \"kubernetes.io/projected/3a1c3ede-cf8a-4fc1-9a11-642b5546723c-kube-api-access-74fmb\") pod \"controller-manager-76ccc9cdd4-wwr8m\" (UID: \"3a1c3ede-cf8a-4fc1-9a11-642b5546723c\") " pod="openshift-controller-manager/controller-manager-76ccc9cdd4-wwr8m"
Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.041924 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3a1c3ede-cf8a-4fc1-9a11-642b5546723c-client-ca\") pod \"controller-manager-76ccc9cdd4-wwr8m\" (UID: \"3a1c3ede-cf8a-4fc1-9a11-642b5546723c\") " pod="openshift-controller-manager/controller-manager-76ccc9cdd4-wwr8m"
Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.042209 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d61e9234-ffb5-44de-b42c-3c4a3028a994-config" (OuterVolumeSpecName: "config") pod "d61e9234-ffb5-44de-b42c-3c4a3028a994" (UID: "d61e9234-ffb5-44de-b42c-3c4a3028a994"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.043434 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d61e9234-ffb5-44de-b42c-3c4a3028a994-client-ca" (OuterVolumeSpecName: "client-ca") pod "d61e9234-ffb5-44de-b42c-3c4a3028a994" (UID: "d61e9234-ffb5-44de-b42c-3c4a3028a994"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.044463 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d61e9234-ffb5-44de-b42c-3c4a3028a994-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "d61e9234-ffb5-44de-b42c-3c4a3028a994" (UID: "d61e9234-ffb5-44de-b42c-3c4a3028a994"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.050717 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d61e9234-ffb5-44de-b42c-3c4a3028a994-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d61e9234-ffb5-44de-b42c-3c4a3028a994" (UID: "d61e9234-ffb5-44de-b42c-3c4a3028a994"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.053481 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d61e9234-ffb5-44de-b42c-3c4a3028a994-kube-api-access-tmgqj" (OuterVolumeSpecName: "kube-api-access-tmgqj") pod "d61e9234-ffb5-44de-b42c-3c4a3028a994" (UID: "d61e9234-ffb5-44de-b42c-3c4a3028a994"). InnerVolumeSpecName "kube-api-access-tmgqj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.143598 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74fmb\" (UniqueName: \"kubernetes.io/projected/3a1c3ede-cf8a-4fc1-9a11-642b5546723c-kube-api-access-74fmb\") pod \"controller-manager-76ccc9cdd4-wwr8m\" (UID: \"3a1c3ede-cf8a-4fc1-9a11-642b5546723c\") " pod="openshift-controller-manager/controller-manager-76ccc9cdd4-wwr8m"
Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.148080 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3a1c3ede-cf8a-4fc1-9a11-642b5546723c-client-ca\") pod \"controller-manager-76ccc9cdd4-wwr8m\" (UID: \"3a1c3ede-cf8a-4fc1-9a11-642b5546723c\") " pod="openshift-controller-manager/controller-manager-76ccc9cdd4-wwr8m"
Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.148259 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a1c3ede-cf8a-4fc1-9a11-642b5546723c-serving-cert\") pod \"controller-manager-76ccc9cdd4-wwr8m\" (UID: \"3a1c3ede-cf8a-4fc1-9a11-642b5546723c\") " pod="openshift-controller-manager/controller-manager-76ccc9cdd4-wwr8m"
Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.148329 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3a1c3ede-cf8a-4fc1-9a11-642b5546723c-proxy-ca-bundles\") pod \"controller-manager-76ccc9cdd4-wwr8m\" (UID: \"3a1c3ede-cf8a-4fc1-9a11-642b5546723c\") " pod="openshift-controller-manager/controller-manager-76ccc9cdd4-wwr8m"
Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.150446 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a1c3ede-cf8a-4fc1-9a11-642b5546723c-config\") pod \"controller-manager-76ccc9cdd4-wwr8m\" (UID: \"3a1c3ede-cf8a-4fc1-9a11-642b5546723c\") " pod="openshift-controller-manager/controller-manager-76ccc9cdd4-wwr8m"
Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.150060 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3a1c3ede-cf8a-4fc1-9a11-642b5546723c-proxy-ca-bundles\") pod \"controller-manager-76ccc9cdd4-wwr8m\" (UID: \"3a1c3ede-cf8a-4fc1-9a11-642b5546723c\") " pod="openshift-controller-manager/controller-manager-76ccc9cdd4-wwr8m"
Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.150650 4860 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d61e9234-ffb5-44de-b42c-3c4a3028a994-client-ca\") on node \"crc\" DevicePath \"\""
Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.149429 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3a1c3ede-cf8a-4fc1-9a11-642b5546723c-client-ca\") pod \"controller-manager-76ccc9cdd4-wwr8m\" (UID: \"3a1c3ede-cf8a-4fc1-9a11-642b5546723c\") " pod="openshift-controller-manager/controller-manager-76ccc9cdd4-wwr8m"
Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.150667 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmgqj\" (UniqueName: \"kubernetes.io/projected/d61e9234-ffb5-44de-b42c-3c4a3028a994-kube-api-access-tmgqj\") on node \"crc\" DevicePath \"\""
Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.150679 4860 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d61e9234-ffb5-44de-b42c-3c4a3028a994-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.150689 4860 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d61e9234-ffb5-44de-b42c-3c4a3028a994-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.150704 4860 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d61e9234-ffb5-44de-b42c-3c4a3028a994-config\") on node \"crc\" DevicePath \"\""
Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.151953 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a1c3ede-cf8a-4fc1-9a11-642b5546723c-config\") pod \"controller-manager-76ccc9cdd4-wwr8m\" (UID: \"3a1c3ede-cf8a-4fc1-9a11-642b5546723c\") " pod="openshift-controller-manager/controller-manager-76ccc9cdd4-wwr8m"
Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.165853 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a1c3ede-cf8a-4fc1-9a11-642b5546723c-serving-cert\") pod \"controller-manager-76ccc9cdd4-wwr8m\" (UID: \"3a1c3ede-cf8a-4fc1-9a11-642b5546723c\") " pod="openshift-controller-manager/controller-manager-76ccc9cdd4-wwr8m"
Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.170285 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74fmb\" (UniqueName: \"kubernetes.io/projected/3a1c3ede-cf8a-4fc1-9a11-642b5546723c-kube-api-access-74fmb\") pod \"controller-manager-76ccc9cdd4-wwr8m\" (UID: \"3a1c3ede-cf8a-4fc1-9a11-642b5546723c\") " pod="openshift-controller-manager/controller-manager-76ccc9cdd4-wwr8m"
Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.234862 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p6sh9"
Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.261383 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-n79b7" podStartSLOduration=5.804290566 podStartE2EDuration="1m17.261358148s" podCreationTimestamp="2026-03-20 10:58:30 +0000 UTC" firstStartedPulling="2026-03-20 10:58:32.765814558 +0000 UTC m=+236.987175456" lastFinishedPulling="2026-03-20 10:59:44.22288215 +0000 UTC m=+308.444243038" observedRunningTime="2026-03-20 10:59:47.023338361 +0000 UTC m=+311.244699259" watchObservedRunningTime="2026-03-20 10:59:47.261358148 +0000 UTC m=+311.482719036"
Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.281760 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6988ff758f-g2fv4"
Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.285755 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-76ccc9cdd4-wwr8m"
Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.326003 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6fc74dd9cd-gwtmv"]
Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.328648 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6fc74dd9cd-gwtmv"]
Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.352950 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b622f82-e01c-42b8-8061-16b6e8f551fb-utilities\") pod \"7b622f82-e01c-42b8-8061-16b6e8f551fb\" (UID: \"7b622f82-e01c-42b8-8061-16b6e8f551fb\") "
Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.353112 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-77rcr\" (UniqueName: \"kubernetes.io/projected/7b622f82-e01c-42b8-8061-16b6e8f551fb-kube-api-access-77rcr\") pod \"7b622f82-e01c-42b8-8061-16b6e8f551fb\" (UID: \"7b622f82-e01c-42b8-8061-16b6e8f551fb\") "
Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.353183 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b622f82-e01c-42b8-8061-16b6e8f551fb-catalog-content\") pod \"7b622f82-e01c-42b8-8061-16b6e8f551fb\" (UID: \"7b622f82-e01c-42b8-8061-16b6e8f551fb\") "
Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.353830 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b622f82-e01c-42b8-8061-16b6e8f551fb-utilities" (OuterVolumeSpecName: "utilities") pod "7b622f82-e01c-42b8-8061-16b6e8f551fb" (UID: "7b622f82-e01c-42b8-8061-16b6e8f551fb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.356657 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b622f82-e01c-42b8-8061-16b6e8f551fb-kube-api-access-77rcr" (OuterVolumeSpecName: "kube-api-access-77rcr") pod "7b622f82-e01c-42b8-8061-16b6e8f551fb" (UID: "7b622f82-e01c-42b8-8061-16b6e8f551fb"). InnerVolumeSpecName "kube-api-access-77rcr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.408832 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b622f82-e01c-42b8-8061-16b6e8f551fb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7b622f82-e01c-42b8-8061-16b6e8f551fb" (UID: "7b622f82-e01c-42b8-8061-16b6e8f551fb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.422397 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d61e9234-ffb5-44de-b42c-3c4a3028a994" path="/var/lib/kubelet/pods/d61e9234-ffb5-44de-b42c-3c4a3028a994/volumes"
Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.454452 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mht7p\" (UniqueName: \"kubernetes.io/projected/dd77d1cf-d25f-459c-95b4-96c63acd0462-kube-api-access-mht7p\") pod \"dd77d1cf-d25f-459c-95b4-96c63acd0462\" (UID: \"dd77d1cf-d25f-459c-95b4-96c63acd0462\") "
Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.454525 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd77d1cf-d25f-459c-95b4-96c63acd0462-config\") pod \"dd77d1cf-d25f-459c-95b4-96c63acd0462\" (UID: \"dd77d1cf-d25f-459c-95b4-96c63acd0462\") "
Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.454565 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dd77d1cf-d25f-459c-95b4-96c63acd0462-client-ca\") pod \"dd77d1cf-d25f-459c-95b4-96c63acd0462\" (UID: \"dd77d1cf-d25f-459c-95b4-96c63acd0462\") "
Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.454630 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dd77d1cf-d25f-459c-95b4-96c63acd0462-serving-cert\") pod \"dd77d1cf-d25f-459c-95b4-96c63acd0462\" (UID: \"dd77d1cf-d25f-459c-95b4-96c63acd0462\") "
Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.454994 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-77rcr\" (UniqueName: \"kubernetes.io/projected/7b622f82-e01c-42b8-8061-16b6e8f551fb-kube-api-access-77rcr\") on node \"crc\" DevicePath \"\""
Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.455019 4860 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b622f82-e01c-42b8-8061-16b6e8f551fb-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.455029 4860 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b622f82-e01c-42b8-8061-16b6e8f551fb-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.457494 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd77d1cf-d25f-459c-95b4-96c63acd0462-config" (OuterVolumeSpecName: "config") pod "dd77d1cf-d25f-459c-95b4-96c63acd0462" (UID: "dd77d1cf-d25f-459c-95b4-96c63acd0462"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.457918 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd77d1cf-d25f-459c-95b4-96c63acd0462-client-ca" (OuterVolumeSpecName: "client-ca") pod "dd77d1cf-d25f-459c-95b4-96c63acd0462" (UID: "dd77d1cf-d25f-459c-95b4-96c63acd0462"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.459614 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd77d1cf-d25f-459c-95b4-96c63acd0462-kube-api-access-mht7p" (OuterVolumeSpecName: "kube-api-access-mht7p") pod "dd77d1cf-d25f-459c-95b4-96c63acd0462" (UID: "dd77d1cf-d25f-459c-95b4-96c63acd0462"). InnerVolumeSpecName "kube-api-access-mht7p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.462134 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd77d1cf-d25f-459c-95b4-96c63acd0462-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "dd77d1cf-d25f-459c-95b4-96c63acd0462" (UID: "dd77d1cf-d25f-459c-95b4-96c63acd0462"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.581872 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mht7p\" (UniqueName: \"kubernetes.io/projected/dd77d1cf-d25f-459c-95b4-96c63acd0462-kube-api-access-mht7p\") on node \"crc\" DevicePath \"\""
Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.581920 4860 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd77d1cf-d25f-459c-95b4-96c63acd0462-config\") on node \"crc\" DevicePath \"\""
Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.581934 4860 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dd77d1cf-d25f-459c-95b4-96c63acd0462-client-ca\") on node \"crc\" DevicePath \"\""
Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.581949 4860 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dd77d1cf-d25f-459c-95b4-96c63acd0462-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 10:59:48 crc kubenswrapper[4860]: I0320 10:59:48.004439 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p6sh9" event={"ID":"7b622f82-e01c-42b8-8061-16b6e8f551fb","Type":"ContainerDied","Data":"a0b846fa7e38edf968a2ecba37cb073cb9550be5be8e0d139e448911ef2ef8dd"}
Mar 20 10:59:48 crc kubenswrapper[4860]: I0320 10:59:48.004495 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p6sh9"
Mar 20 10:59:48 crc kubenswrapper[4860]: I0320 10:59:48.004536 4860 scope.go:117] "RemoveContainer" containerID="7dbc6b1d3db25220d5d86cbd542a694d0703ce989da64d73e6670a17039b2234"
Mar 20 10:59:48 crc kubenswrapper[4860]: I0320 10:59:48.010646 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6988ff758f-g2fv4" event={"ID":"dd77d1cf-d25f-459c-95b4-96c63acd0462","Type":"ContainerDied","Data":"e2890569cb8b864dbb24f8804e49612f8e5c8d8c8aceaa430ef20c9b0f36ef98"}
Mar 20 10:59:48 crc kubenswrapper[4860]: I0320 10:59:48.010783 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6988ff758f-g2fv4"
Mar 20 10:59:48 crc kubenswrapper[4860]: I0320 10:59:48.033356 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p6sh9"]
Mar 20 10:59:48 crc kubenswrapper[4860]: I0320 10:59:48.040625 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-p6sh9"]
Mar 20 10:59:48 crc kubenswrapper[4860]: I0320 10:59:48.047622 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6988ff758f-g2fv4"]
Mar 20 10:59:48 crc kubenswrapper[4860]: I0320 10:59:48.050509 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6988ff758f-g2fv4"]
Mar 20 10:59:48 crc kubenswrapper[4860]: I0320 10:59:48.390127 4860 scope.go:117] "RemoveContainer" containerID="c56ad28f23266e55b726d6def152f7daf70f2346cb9ad3e38d40fd7ed925aeca"
Mar 20 10:59:48 crc kubenswrapper[4860]: I0320 10:59:48.446195 4860 scope.go:117] "RemoveContainer" containerID="e24cde6154df13c72246e903afddf246ae7bde629e68f46946db4ded716f4fbf"
Mar 20 10:59:48 crc kubenswrapper[4860]: I0320 10:59:48.479623 4860 scope.go:117] "RemoveContainer" containerID="7bce21f78b56a9d267f2935e0991aace1419bf6186a2be2b3f534d5ed5bf5798"
Mar 20 10:59:48 crc kubenswrapper[4860]: I0320 10:59:48.661190 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-76ccc9cdd4-wwr8m"]
Mar 20 10:59:48 crc kubenswrapper[4860]: W0320 10:59:48.677540 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a1c3ede_cf8a_4fc1_9a11_642b5546723c.slice/crio-c75901e95ec374b4b949a3afd9193502e556912cf621619a18f896cd80ddbb54 WatchSource:0}: Error finding container c75901e95ec374b4b949a3afd9193502e556912cf621619a18f896cd80ddbb54: Status 404 returned error can't find the container with id c75901e95ec374b4b949a3afd9193502e556912cf621619a18f896cd80ddbb54
Mar 20 10:59:49 crc kubenswrapper[4860]: I0320 10:59:49.021670 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-76ccc9cdd4-wwr8m" event={"ID":"3a1c3ede-cf8a-4fc1-9a11-642b5546723c","Type":"ContainerStarted","Data":"c75901e95ec374b4b949a3afd9193502e556912cf621619a18f896cd80ddbb54"}
Mar 20 10:59:49 crc kubenswrapper[4860]: I0320 10:59:49.104707 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6656bbf56-bpd7h"]
Mar 20 10:59:49 crc kubenswrapper[4860]: E0320 10:59:49.104944 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b622f82-e01c-42b8-8061-16b6e8f551fb" containerName="extract-utilities"
Mar 20 10:59:49 crc kubenswrapper[4860]: I0320 10:59:49.104958 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b622f82-e01c-42b8-8061-16b6e8f551fb" containerName="extract-utilities"
Mar 20 10:59:49 crc kubenswrapper[4860]: E0320 10:59:49.104971 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd77d1cf-d25f-459c-95b4-96c63acd0462" containerName="route-controller-manager"
Mar 20 10:59:49 crc kubenswrapper[4860]: I0320 10:59:49.104977 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd77d1cf-d25f-459c-95b4-96c63acd0462" containerName="route-controller-manager"
Mar 20 10:59:49 crc kubenswrapper[4860]: E0320 10:59:49.104986 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b622f82-e01c-42b8-8061-16b6e8f551fb" containerName="registry-server"
Mar 20 10:59:49 crc kubenswrapper[4860]: I0320 10:59:49.104992 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b622f82-e01c-42b8-8061-16b6e8f551fb" containerName="registry-server"
Mar 20 10:59:49 crc kubenswrapper[4860]: E0320 10:59:49.105009 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b622f82-e01c-42b8-8061-16b6e8f551fb" containerName="extract-content"
Mar 20 10:59:49 crc kubenswrapper[4860]: I0320 10:59:49.105014 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b622f82-e01c-42b8-8061-16b6e8f551fb" containerName="extract-content"
Mar 20 10:59:49 crc kubenswrapper[4860]: I0320 10:59:49.105116 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd77d1cf-d25f-459c-95b4-96c63acd0462" containerName="route-controller-manager"
Mar 20 10:59:49 crc kubenswrapper[4860]: I0320 10:59:49.105129 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b622f82-e01c-42b8-8061-16b6e8f551fb" containerName="registry-server"
Mar 20 10:59:49 crc kubenswrapper[4860]: I0320 10:59:49.105599 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6656bbf56-bpd7h"
Mar 20 10:59:49 crc kubenswrapper[4860]: I0320 10:59:49.108416 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 20 10:59:49 crc kubenswrapper[4860]: I0320 10:59:49.108703 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 20 10:59:49 crc kubenswrapper[4860]: I0320 10:59:49.108965 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 20 10:59:49 crc kubenswrapper[4860]: I0320 10:59:49.109116 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 20 10:59:49 crc kubenswrapper[4860]: I0320 10:59:49.109332 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 20 10:59:49 crc kubenswrapper[4860]: I0320 10:59:49.109355 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 20 10:59:49 crc kubenswrapper[4860]: I0320 10:59:49.113509 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6656bbf56-bpd7h"]
Mar 20 10:59:49 crc kubenswrapper[4860]: I0320 10:59:49.213125 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2zw2\" (UniqueName: \"kubernetes.io/projected/a7b77bd7-65cb-402d-a918-3b3e8457b656-kube-api-access-v2zw2\") pod \"route-controller-manager-6656bbf56-bpd7h\" (UID: \"a7b77bd7-65cb-402d-a918-3b3e8457b656\") " pod="openshift-route-controller-manager/route-controller-manager-6656bbf56-bpd7h"
Mar 20 10:59:49 crc kubenswrapper[4860]: I0320 10:59:49.213630 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a7b77bd7-65cb-402d-a918-3b3e8457b656-client-ca\") pod \"route-controller-manager-6656bbf56-bpd7h\" (UID: \"a7b77bd7-65cb-402d-a918-3b3e8457b656\") " pod="openshift-route-controller-manager/route-controller-manager-6656bbf56-bpd7h"
Mar 20 10:59:49 crc kubenswrapper[4860]: I0320 10:59:49.213657 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7b77bd7-65cb-402d-a918-3b3e8457b656-serving-cert\") pod \"route-controller-manager-6656bbf56-bpd7h\" (UID: \"a7b77bd7-65cb-402d-a918-3b3e8457b656\") " pod="openshift-route-controller-manager/route-controller-manager-6656bbf56-bpd7h"
Mar 20 10:59:49 crc kubenswrapper[4860]: I0320 10:59:49.213679 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7b77bd7-65cb-402d-a918-3b3e8457b656-config\") pod \"route-controller-manager-6656bbf56-bpd7h\" (UID: \"a7b77bd7-65cb-402d-a918-3b3e8457b656\") " pod="openshift-route-controller-manager/route-controller-manager-6656bbf56-bpd7h"
Mar 20 10:59:49 crc kubenswrapper[4860]: I0320 10:59:49.314556 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2zw2\" (UniqueName: \"kubernetes.io/projected/a7b77bd7-65cb-402d-a918-3b3e8457b656-kube-api-access-v2zw2\") pod \"route-controller-manager-6656bbf56-bpd7h\" (UID: \"a7b77bd7-65cb-402d-a918-3b3e8457b656\") " pod="openshift-route-controller-manager/route-controller-manager-6656bbf56-bpd7h"
Mar 20 10:59:49 crc kubenswrapper[4860]: I0320 10:59:49.314609 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a7b77bd7-65cb-402d-a918-3b3e8457b656-client-ca\") pod \"route-controller-manager-6656bbf56-bpd7h\" (UID: \"a7b77bd7-65cb-402d-a918-3b3e8457b656\") " pod="openshift-route-controller-manager/route-controller-manager-6656bbf56-bpd7h"
Mar 20 10:59:49 crc kubenswrapper[4860]: I0320 10:59:49.314633 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7b77bd7-65cb-402d-a918-3b3e8457b656-serving-cert\") pod \"route-controller-manager-6656bbf56-bpd7h\" (UID: \"a7b77bd7-65cb-402d-a918-3b3e8457b656\") " pod="openshift-route-controller-manager/route-controller-manager-6656bbf56-bpd7h"
Mar 20 10:59:49 crc kubenswrapper[4860]: I0320 10:59:49.314659 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7b77bd7-65cb-402d-a918-3b3e8457b656-config\") pod \"route-controller-manager-6656bbf56-bpd7h\" (UID: \"a7b77bd7-65cb-402d-a918-3b3e8457b656\") " pod="openshift-route-controller-manager/route-controller-manager-6656bbf56-bpd7h"
Mar 20 10:59:49 crc kubenswrapper[4860]: I0320 10:59:49.315819 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a7b77bd7-65cb-402d-a918-3b3e8457b656-client-ca\") pod \"route-controller-manager-6656bbf56-bpd7h\" (UID: \"a7b77bd7-65cb-402d-a918-3b3e8457b656\") " pod="openshift-route-controller-manager/route-controller-manager-6656bbf56-bpd7h"
Mar 20 10:59:49 crc kubenswrapper[4860]: I0320 10:59:49.315927 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7b77bd7-65cb-402d-a918-3b3e8457b656-config\") pod \"route-controller-manager-6656bbf56-bpd7h\" (UID: \"a7b77bd7-65cb-402d-a918-3b3e8457b656\") " pod="openshift-route-controller-manager/route-controller-manager-6656bbf56-bpd7h"
Mar 20 10:59:49 crc kubenswrapper[4860]: I0320 10:59:49.325169 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7b77bd7-65cb-402d-a918-3b3e8457b656-serving-cert\") pod \"route-controller-manager-6656bbf56-bpd7h\" (UID: \"a7b77bd7-65cb-402d-a918-3b3e8457b656\") " pod="openshift-route-controller-manager/route-controller-manager-6656bbf56-bpd7h"
Mar 20 10:59:49 crc kubenswrapper[4860]: I0320 10:59:49.337085 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2zw2\" (UniqueName: \"kubernetes.io/projected/a7b77bd7-65cb-402d-a918-3b3e8457b656-kube-api-access-v2zw2\") pod \"route-controller-manager-6656bbf56-bpd7h\" (UID: \"a7b77bd7-65cb-402d-a918-3b3e8457b656\") " pod="openshift-route-controller-manager/route-controller-manager-6656bbf56-bpd7h"
Mar 20 10:59:49 crc kubenswrapper[4860]: I0320 10:59:49.420908 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b622f82-e01c-42b8-8061-16b6e8f551fb" path="/var/lib/kubelet/pods/7b622f82-e01c-42b8-8061-16b6e8f551fb/volumes"
Mar 20 10:59:49 crc kubenswrapper[4860]: I0320 10:59:49.421858 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd77d1cf-d25f-459c-95b4-96c63acd0462" path="/var/lib/kubelet/pods/dd77d1cf-d25f-459c-95b4-96c63acd0462/volumes"
Mar 20 10:59:49 crc kubenswrapper[4860]: I0320 10:59:49.424763 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6656bbf56-bpd7h"
Mar 20 10:59:49 crc kubenswrapper[4860]: I0320 10:59:49.694011 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6656bbf56-bpd7h"]
Mar 20 10:59:50 crc kubenswrapper[4860]: I0320 10:59:50.030015 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6656bbf56-bpd7h" event={"ID":"a7b77bd7-65cb-402d-a918-3b3e8457b656","Type":"ContainerStarted","Data":"ef27fae7a5852a1262f2e378fa238cc0332680623e5aec34999ac8b5d2ec62a0"}
Mar 20 10:59:50 crc kubenswrapper[4860]: I0320 10:59:50.030542 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6656bbf56-bpd7h" event={"ID":"a7b77bd7-65cb-402d-a918-3b3e8457b656","Type":"ContainerStarted","Data":"9ea70187f1142ba967548126a5cae21547c6b783bca186f712cd7eea24603a59"}
Mar 20 10:59:50 crc kubenswrapper[4860]: I0320 10:59:50.030575 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6656bbf56-bpd7h"
Mar 20 10:59:50 crc kubenswrapper[4860]: I0320 10:59:50.035548 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qq8bh" event={"ID":"514f05c3-1404-46c6-9f4d-68437ea8ee0b","Type":"ContainerStarted","Data":"dd62fe82165d292d4799e06ebb35b908db080de2fe5e228a5d81891d222fa958"}
Mar 20 10:59:50 crc kubenswrapper[4860]: I0320 10:59:50.039776 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jx27x" event={"ID":"f81a43aa-2c39-4d49-8526-f097322dd7bf","Type":"ContainerStarted","Data":"19e5b54cfca0bde6cf7f8393230c7ed69a1c9b30a1db1b83c9b87021655f765e"}
Mar 20 10:59:50 crc kubenswrapper[4860]: I0320 10:59:50.043601 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r7ckk" event={"ID":"f0e14a08-824b-450f-bf98-2a476da0d44b","Type":"ContainerStarted","Data":"1aa3a53909a5c843b03ac48e95c43de617bf5c9f3bb63dc85380fa7fe7677bb4"}
Mar 20 10:59:50 crc kubenswrapper[4860]: I0320 10:59:50.045617 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-76ccc9cdd4-wwr8m" event={"ID":"3a1c3ede-cf8a-4fc1-9a11-642b5546723c","Type":"ContainerStarted","Data":"697843f7d5d53f9ae0d486b963ad3259397c8e7cb3a46cd16fd19db2821f91d0"}
Mar 20 10:59:50 crc kubenswrapper[4860]: I0320 10:59:50.046375 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-76ccc9cdd4-wwr8m"
Mar 20 10:59:50 crc kubenswrapper[4860]: I0320 10:59:50.053099 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-76ccc9cdd4-wwr8m"
Mar 20 10:59:50 crc kubenswrapper[4860]: I0320 10:59:50.055574 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6656bbf56-bpd7h" podStartSLOduration=6.055550101 podStartE2EDuration="6.055550101s" podCreationTimestamp="2026-03-20 10:59:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:59:50.054066289 +0000 UTC m=+314.275427197" watchObservedRunningTime="2026-03-20 10:59:50.055550101 +0000 UTC m=+314.276910999"
Mar 20 10:59:50 crc kubenswrapper[4860]: I0320 10:59:50.082907 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qq8bh" podStartSLOduration=3.765153021 podStartE2EDuration="1m17.082880146s" podCreationTimestamp="2026-03-20 10:58:33 +0000 UTC" firstStartedPulling="2026-03-20 10:58:35.072532412 +0000 UTC m=+239.293893310" lastFinishedPulling="2026-03-20 
10:59:48.390259537 +0000 UTC m=+312.611620435" observedRunningTime="2026-03-20 10:59:50.076009451 +0000 UTC m=+314.297370369" watchObservedRunningTime="2026-03-20 10:59:50.082880146 +0000 UTC m=+314.304241054" Mar 20 10:59:50 crc kubenswrapper[4860]: I0320 10:59:50.096600 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-76ccc9cdd4-wwr8m" podStartSLOduration=6.096575424 podStartE2EDuration="6.096575424s" podCreationTimestamp="2026-03-20 10:59:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:59:50.094598918 +0000 UTC m=+314.315959836" watchObservedRunningTime="2026-03-20 10:59:50.096575424 +0000 UTC m=+314.317936322" Mar 20 10:59:50 crc kubenswrapper[4860]: I0320 10:59:50.117340 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-r7ckk" podStartSLOduration=5.899908142 podStartE2EDuration="1m20.117309631s" podCreationTimestamp="2026-03-20 10:58:30 +0000 UTC" firstStartedPulling="2026-03-20 10:58:32.998454569 +0000 UTC m=+237.219815467" lastFinishedPulling="2026-03-20 10:59:47.215856058 +0000 UTC m=+311.437216956" observedRunningTime="2026-03-20 10:59:50.11301918 +0000 UTC m=+314.334380078" watchObservedRunningTime="2026-03-20 10:59:50.117309631 +0000 UTC m=+314.338670529" Mar 20 10:59:50 crc kubenswrapper[4860]: I0320 10:59:50.545351 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6656bbf56-bpd7h" Mar 20 10:59:50 crc kubenswrapper[4860]: I0320 10:59:50.570710 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jx27x" podStartSLOduration=4.569617128 podStartE2EDuration="1m17.570684273s" podCreationTimestamp="2026-03-20 10:58:33 +0000 UTC" firstStartedPulling="2026-03-20 
10:58:35.082815838 +0000 UTC m=+239.304176736" lastFinishedPulling="2026-03-20 10:59:48.083882983 +0000 UTC m=+312.305243881" observedRunningTime="2026-03-20 10:59:50.141210669 +0000 UTC m=+314.362571567" watchObservedRunningTime="2026-03-20 10:59:50.570684273 +0000 UTC m=+314.792045171" Mar 20 10:59:50 crc kubenswrapper[4860]: I0320 10:59:50.764425 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-n79b7" Mar 20 10:59:50 crc kubenswrapper[4860]: I0320 10:59:50.764489 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-n79b7" Mar 20 10:59:50 crc kubenswrapper[4860]: I0320 10:59:50.822708 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-n79b7" Mar 20 10:59:51 crc kubenswrapper[4860]: I0320 10:59:51.093732 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-n79b7" Mar 20 10:59:51 crc kubenswrapper[4860]: I0320 10:59:51.167432 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-r7ckk" Mar 20 10:59:51 crc kubenswrapper[4860]: I0320 10:59:51.167484 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-r7ckk" Mar 20 10:59:52 crc kubenswrapper[4860]: I0320 10:59:52.213491 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-r7ckk" podUID="f0e14a08-824b-450f-bf98-2a476da0d44b" containerName="registry-server" probeResult="failure" output=< Mar 20 10:59:52 crc kubenswrapper[4860]: timeout: failed to connect service ":50051" within 1s Mar 20 10:59:52 crc kubenswrapper[4860]: > Mar 20 10:59:52 crc kubenswrapper[4860]: I0320 10:59:52.863585 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-marketplace-5jpww" Mar 20 10:59:52 crc kubenswrapper[4860]: I0320 10:59:52.864051 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5jpww" Mar 20 10:59:52 crc kubenswrapper[4860]: I0320 10:59:52.902086 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5jpww" Mar 20 10:59:53 crc kubenswrapper[4860]: I0320 10:59:53.096288 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5jpww" Mar 20 10:59:53 crc kubenswrapper[4860]: I0320 10:59:53.660984 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qq8bh" Mar 20 10:59:53 crc kubenswrapper[4860]: I0320 10:59:53.662277 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qq8bh" Mar 20 10:59:54 crc kubenswrapper[4860]: I0320 10:59:54.079006 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jx27x" Mar 20 10:59:54 crc kubenswrapper[4860]: I0320 10:59:54.079456 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jx27x" Mar 20 10:59:54 crc kubenswrapper[4860]: I0320 10:59:54.699412 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qq8bh" podUID="514f05c3-1404-46c6-9f4d-68437ea8ee0b" containerName="registry-server" probeResult="failure" output=< Mar 20 10:59:54 crc kubenswrapper[4860]: timeout: failed to connect service ":50051" within 1s Mar 20 10:59:54 crc kubenswrapper[4860]: > Mar 20 10:59:55 crc kubenswrapper[4860]: I0320 10:59:55.047983 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5jpww"] Mar 20 10:59:55 crc kubenswrapper[4860]: 
I0320 10:59:55.074890 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5jpww" podUID="2268b7ae-c1db-4ef4-8236-60f7cfa277a1" containerName="registry-server" containerID="cri-o://e9d5be622305f211c3e12e12dfe924fbcb59aee9d65c271786e393a4e06cec5a" gracePeriod=2 Mar 20 10:59:55 crc kubenswrapper[4860]: I0320 10:59:55.117413 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jx27x" podUID="f81a43aa-2c39-4d49-8526-f097322dd7bf" containerName="registry-server" probeResult="failure" output=< Mar 20 10:59:55 crc kubenswrapper[4860]: timeout: failed to connect service ":50051" within 1s Mar 20 10:59:55 crc kubenswrapper[4860]: > Mar 20 10:59:55 crc kubenswrapper[4860]: I0320 10:59:55.995181 4860 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 10:59:55 crc kubenswrapper[4860]: I0320 10:59:55.996606 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 10:59:55 crc kubenswrapper[4860]: I0320 10:59:55.996902 4860 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 10:59:55 crc kubenswrapper[4860]: I0320 10:59:55.997483 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433" gracePeriod=15 Mar 20 10:59:55 crc kubenswrapper[4860]: I0320 10:59:55.997489 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://2c14e94ed75bdc67469f86002f56ed08fc3810f5a6316401c00a66d56d405e14" gracePeriod=15 Mar 20 10:59:55 crc kubenswrapper[4860]: I0320 10:59:55.997542 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392" gracePeriod=15 Mar 20 10:59:55 crc kubenswrapper[4860]: I0320 10:59:55.997554 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e" gracePeriod=15 Mar 20 10:59:55 crc kubenswrapper[4860]: I0320 10:59:55.997500 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017" gracePeriod=15 Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:55.999307 4860 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 10:59:56 crc kubenswrapper[4860]: E0320 10:59:55.999481 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:55.999501 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 20 10:59:56 crc kubenswrapper[4860]: E0320 10:59:55.999515 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:55.999525 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 20 10:59:56 crc kubenswrapper[4860]: E0320 10:59:55.999534 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:55.999541 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 10:59:56 crc kubenswrapper[4860]: E0320 10:59:55.999555 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:55.999562 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 20 10:59:56 crc kubenswrapper[4860]: E0320 10:59:55.999572 4860 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:55.999581 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 10:59:56 crc kubenswrapper[4860]: E0320 10:59:55.999596 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:55.999604 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 20 10:59:56 crc kubenswrapper[4860]: E0320 10:59:55.999614 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:55.999622 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 20 10:59:56 crc kubenswrapper[4860]: E0320 10:59:55.999634 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:55.999641 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:55.999779 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:55.999807 4860 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:55.999815 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:55.999821 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:55.999830 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:55.999839 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:55.999849 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:55.999859 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 20 10:59:56 crc kubenswrapper[4860]: E0320 10:59:55.999999 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.000014 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.000159 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" Mar 20 10:59:56 crc kubenswrapper[4860]: E0320 10:59:56.000328 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.000339 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.092498 4860 generic.go:334] "Generic (PLEG): container finished" podID="2268b7ae-c1db-4ef4-8236-60f7cfa277a1" containerID="e9d5be622305f211c3e12e12dfe924fbcb59aee9d65c271786e393a4e06cec5a" exitCode=0 Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.092547 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5jpww" event={"ID":"2268b7ae-c1db-4ef4-8236-60f7cfa277a1","Type":"ContainerDied","Data":"e9d5be622305f211c3e12e12dfe924fbcb59aee9d65c271786e393a4e06cec5a"} Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.092577 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5jpww" event={"ID":"2268b7ae-c1db-4ef4-8236-60f7cfa277a1","Type":"ContainerDied","Data":"5f7c8a0760c1233f7a673fb0037c3446fe619e19acbc1953810d2c42b3db815b"} Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.092588 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f7c8a0760c1233f7a673fb0037c3446fe619e19acbc1953810d2c42b3db815b" Mar 20 10:59:56 crc kubenswrapper[4860]: E0320 10:59:56.098574 4860 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.201:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.105421 4860 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5jpww" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.106004 4860 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.106170 4860 status_manager.go:851] "Failed to get status for pod" podUID="2268b7ae-c1db-4ef4-8236-60f7cfa277a1" pod="openshift-marketplace/redhat-marketplace-5jpww" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5jpww\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.130094 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.130153 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.130174 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: 
\"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.130195 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.130247 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.130325 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.130355 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.130384 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.231636 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rln5j\" (UniqueName: \"kubernetes.io/projected/2268b7ae-c1db-4ef4-8236-60f7cfa277a1-kube-api-access-rln5j\") pod \"2268b7ae-c1db-4ef4-8236-60f7cfa277a1\" (UID: \"2268b7ae-c1db-4ef4-8236-60f7cfa277a1\") " Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.231782 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2268b7ae-c1db-4ef4-8236-60f7cfa277a1-utilities\") pod \"2268b7ae-c1db-4ef4-8236-60f7cfa277a1\" (UID: \"2268b7ae-c1db-4ef4-8236-60f7cfa277a1\") " Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.231816 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2268b7ae-c1db-4ef4-8236-60f7cfa277a1-catalog-content\") pod \"2268b7ae-c1db-4ef4-8236-60f7cfa277a1\" (UID: \"2268b7ae-c1db-4ef4-8236-60f7cfa277a1\") " Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.232151 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.232193 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: 
\"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.232240 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.232291 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.232335 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.232375 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.232402 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.232438 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.232483 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.232438 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.232448 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.232418 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 
10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.232577 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.232639 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.232721 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.232805 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.233351 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2268b7ae-c1db-4ef4-8236-60f7cfa277a1-utilities" (OuterVolumeSpecName: "utilities") pod "2268b7ae-c1db-4ef4-8236-60f7cfa277a1" (UID: "2268b7ae-c1db-4ef4-8236-60f7cfa277a1"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.237901 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2268b7ae-c1db-4ef4-8236-60f7cfa277a1-kube-api-access-rln5j" (OuterVolumeSpecName: "kube-api-access-rln5j") pod "2268b7ae-c1db-4ef4-8236-60f7cfa277a1" (UID: "2268b7ae-c1db-4ef4-8236-60f7cfa277a1"). InnerVolumeSpecName "kube-api-access-rln5j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.255434 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2268b7ae-c1db-4ef4-8236-60f7cfa277a1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2268b7ae-c1db-4ef4-8236-60f7cfa277a1" (UID: "2268b7ae-c1db-4ef4-8236-60f7cfa277a1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.283876 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" podUID="3587f3ba-577b-425a-adf5-336a8977dcc5" containerName="oauth-openshift" containerID="cri-o://7c8b8128f311582fc677a42230767316cb90b5bf061457d3564d8b09eec2a186" gracePeriod=15 Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.334880 4860 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2268b7ae-c1db-4ef4-8236-60f7cfa277a1-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.334930 4860 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2268b7ae-c1db-4ef4-8236-60f7cfa277a1-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.334941 4860 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-rln5j\" (UniqueName: \"kubernetes.io/projected/2268b7ae-c1db-4ef4-8236-60f7cfa277a1-kube-api-access-rln5j\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.399842 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 10:59:56 crc kubenswrapper[4860]: W0320 10:59:56.424258 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-b832830c7c00cbebd44efc75feed3999dcf7ab0de95aed014738763ca98518d7 WatchSource:0}: Error finding container b832830c7c00cbebd44efc75feed3999dcf7ab0de95aed014738763ca98518d7: Status 404 returned error can't find the container with id b832830c7c00cbebd44efc75feed3999dcf7ab0de95aed014738763ca98518d7 Mar 20 10:59:56 crc kubenswrapper[4860]: E0320 10:59:56.427966 4860 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.201:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189e8799770a1bd4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:59:56.427451348 +0000 UTC m=+320.648812246,LastTimestamp:2026-03-20 10:59:56.427451348 +0000 UTC 
m=+320.648812246,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.746487 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.747744 4860 status_manager.go:851] "Failed to get status for pod" podUID="3587f3ba-577b-425a-adf5-336a8977dcc5" pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-srz5x\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.748331 4860 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.748705 4860 status_manager.go:851] "Failed to get status for pod" podUID="2268b7ae-c1db-4ef4-8236-60f7cfa277a1" pod="openshift-marketplace/redhat-marketplace-5jpww" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5jpww\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.842114 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3587f3ba-577b-425a-adf5-336a8977dcc5-audit-dir\") pod \"3587f3ba-577b-425a-adf5-336a8977dcc5\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.842273 4860 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-user-template-login\") pod \"3587f3ba-577b-425a-adf5-336a8977dcc5\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.842293 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3587f3ba-577b-425a-adf5-336a8977dcc5-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "3587f3ba-577b-425a-adf5-336a8977dcc5" (UID: "3587f3ba-577b-425a-adf5-336a8977dcc5"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.842311 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-service-ca\") pod \"3587f3ba-577b-425a-adf5-336a8977dcc5\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.842428 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-trusted-ca-bundle\") pod \"3587f3ba-577b-425a-adf5-336a8977dcc5\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.842504 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-ocp-branding-template\") pod \"3587f3ba-577b-425a-adf5-336a8977dcc5\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.842559 4860 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-user-template-provider-selection\") pod \"3587f3ba-577b-425a-adf5-336a8977dcc5\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.842643 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-user-idp-0-file-data\") pod \"3587f3ba-577b-425a-adf5-336a8977dcc5\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.842705 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-user-template-error\") pod \"3587f3ba-577b-425a-adf5-336a8977dcc5\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.842776 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-session\") pod \"3587f3ba-577b-425a-adf5-336a8977dcc5\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.842823 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-cliconfig\") pod \"3587f3ba-577b-425a-adf5-336a8977dcc5\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.842900 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-router-certs\") pod \"3587f3ba-577b-425a-adf5-336a8977dcc5\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.842960 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqnwc\" (UniqueName: \"kubernetes.io/projected/3587f3ba-577b-425a-adf5-336a8977dcc5-kube-api-access-rqnwc\") pod \"3587f3ba-577b-425a-adf5-336a8977dcc5\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.843048 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3587f3ba-577b-425a-adf5-336a8977dcc5-audit-policies\") pod \"3587f3ba-577b-425a-adf5-336a8977dcc5\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.843109 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-serving-cert\") pod \"3587f3ba-577b-425a-adf5-336a8977dcc5\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.843326 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "3587f3ba-577b-425a-adf5-336a8977dcc5" (UID: "3587f3ba-577b-425a-adf5-336a8977dcc5"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.843378 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "3587f3ba-577b-425a-adf5-336a8977dcc5" (UID: "3587f3ba-577b-425a-adf5-336a8977dcc5"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.843757 4860 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3587f3ba-577b-425a-adf5-336a8977dcc5-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.843801 4860 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.843831 4860 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.844120 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "3587f3ba-577b-425a-adf5-336a8977dcc5" (UID: "3587f3ba-577b-425a-adf5-336a8977dcc5"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.844788 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3587f3ba-577b-425a-adf5-336a8977dcc5-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "3587f3ba-577b-425a-adf5-336a8977dcc5" (UID: "3587f3ba-577b-425a-adf5-336a8977dcc5"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.846837 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "3587f3ba-577b-425a-adf5-336a8977dcc5" (UID: "3587f3ba-577b-425a-adf5-336a8977dcc5"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.847341 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "3587f3ba-577b-425a-adf5-336a8977dcc5" (UID: "3587f3ba-577b-425a-adf5-336a8977dcc5"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.847875 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "3587f3ba-577b-425a-adf5-336a8977dcc5" (UID: "3587f3ba-577b-425a-adf5-336a8977dcc5"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.848137 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "3587f3ba-577b-425a-adf5-336a8977dcc5" (UID: "3587f3ba-577b-425a-adf5-336a8977dcc5"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.848147 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3587f3ba-577b-425a-adf5-336a8977dcc5-kube-api-access-rqnwc" (OuterVolumeSpecName: "kube-api-access-rqnwc") pod "3587f3ba-577b-425a-adf5-336a8977dcc5" (UID: "3587f3ba-577b-425a-adf5-336a8977dcc5"). InnerVolumeSpecName "kube-api-access-rqnwc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.850107 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "3587f3ba-577b-425a-adf5-336a8977dcc5" (UID: "3587f3ba-577b-425a-adf5-336a8977dcc5"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.850870 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "3587f3ba-577b-425a-adf5-336a8977dcc5" (UID: "3587f3ba-577b-425a-adf5-336a8977dcc5"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.850950 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "3587f3ba-577b-425a-adf5-336a8977dcc5" (UID: "3587f3ba-577b-425a-adf5-336a8977dcc5"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.851118 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "3587f3ba-577b-425a-adf5-336a8977dcc5" (UID: "3587f3ba-577b-425a-adf5-336a8977dcc5"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.945792 4860 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.945835 4860 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.945854 4860 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.945870 4860 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.945888 4860 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.945903 4860 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.945917 4860 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.945931 4860 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.945946 4860 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.945960 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqnwc\" (UniqueName: \"kubernetes.io/projected/3587f3ba-577b-425a-adf5-336a8977dcc5-kube-api-access-rqnwc\") on node 
\"crc\" DevicePath \"\"" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.945972 4860 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3587f3ba-577b-425a-adf5-336a8977dcc5-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.103407 4860 generic.go:334] "Generic (PLEG): container finished" podID="3587f3ba-577b-425a-adf5-336a8977dcc5" containerID="7c8b8128f311582fc677a42230767316cb90b5bf061457d3564d8b09eec2a186" exitCode=0 Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.103528 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" event={"ID":"3587f3ba-577b-425a-adf5-336a8977dcc5","Type":"ContainerDied","Data":"7c8b8128f311582fc677a42230767316cb90b5bf061457d3564d8b09eec2a186"} Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.103573 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" event={"ID":"3587f3ba-577b-425a-adf5-336a8977dcc5","Type":"ContainerDied","Data":"366c71d2561bff010f4d5dff91d7764636b34e8d53c1f0235c50a2b7eb65710b"} Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.103602 4860 scope.go:117] "RemoveContainer" containerID="7c8b8128f311582fc677a42230767316cb90b5bf061457d3564d8b09eec2a186" Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.104328 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.106156 4860 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.106647 4860 status_manager.go:851] "Failed to get status for pod" podUID="2268b7ae-c1db-4ef4-8236-60f7cfa277a1" pod="openshift-marketplace/redhat-marketplace-5jpww" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5jpww\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.107197 4860 status_manager.go:851] "Failed to get status for pod" podUID="3587f3ba-577b-425a-adf5-336a8977dcc5" pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-srz5x\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.107588 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.109437 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.110312 4860 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="2c14e94ed75bdc67469f86002f56ed08fc3810f5a6316401c00a66d56d405e14" exitCode=0 Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.110337 4860 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392" exitCode=0 Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.110347 4860 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017" exitCode=0 Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.110359 4860 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e" exitCode=2 Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.112564 4860 generic.go:334] "Generic (PLEG): container finished" podID="f3e81cb0-739d-41be-b9fb-53a1c3de6a3d" containerID="8d39974009a23179ff960e42776dd6479d915e26394e49ae5752f3d385d29790" exitCode=0 Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.112636 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"f3e81cb0-739d-41be-b9fb-53a1c3de6a3d","Type":"ContainerDied","Data":"8d39974009a23179ff960e42776dd6479d915e26394e49ae5752f3d385d29790"} Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.113513 4860 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.113705 4860 status_manager.go:851] "Failed to get status for pod" podUID="2268b7ae-c1db-4ef4-8236-60f7cfa277a1" 
pod="openshift-marketplace/redhat-marketplace-5jpww" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5jpww\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.113849 4860 status_manager.go:851] "Failed to get status for pod" podUID="f3e81cb0-739d-41be-b9fb-53a1c3de6a3d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.113993 4860 status_manager.go:851] "Failed to get status for pod" podUID="3587f3ba-577b-425a-adf5-336a8977dcc5" pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-srz5x\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.125023 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5jpww" Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.125307 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"6f9fcf8849363a86240c3e522164d76994e316f4d215b2b524d3e54d9f3d5cbb"} Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.125367 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"b832830c7c00cbebd44efc75feed3999dcf7ab0de95aed014738763ca98518d7"} Mar 20 10:59:57 crc kubenswrapper[4860]: E0320 10:59:57.127032 4860 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.201:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.128456 4860 status_manager.go:851] "Failed to get status for pod" podUID="3587f3ba-577b-425a-adf5-336a8977dcc5" pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-srz5x\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.128831 4860 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.129195 4860 status_manager.go:851] "Failed to get status 
for pod" podUID="2268b7ae-c1db-4ef4-8236-60f7cfa277a1" pod="openshift-marketplace/redhat-marketplace-5jpww" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5jpww\": dial tcp 38.102.83.201:6443: connect: connection refused"
Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.129858 4860 status_manager.go:851] "Failed to get status for pod" podUID="f3e81cb0-739d-41be-b9fb-53a1c3de6a3d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.201:6443: connect: connection refused"
Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.130439 4860 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.201:6443: connect: connection refused"
Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.130827 4860 status_manager.go:851] "Failed to get status for pod" podUID="2268b7ae-c1db-4ef4-8236-60f7cfa277a1" pod="openshift-marketplace/redhat-marketplace-5jpww" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5jpww\": dial tcp 38.102.83.201:6443: connect: connection refused"
Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.131158 4860 status_manager.go:851] "Failed to get status for pod" podUID="f3e81cb0-739d-41be-b9fb-53a1c3de6a3d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.201:6443: connect: connection refused"
Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.131616 4860 status_manager.go:851] "Failed to get status for pod" podUID="3587f3ba-577b-425a-adf5-336a8977dcc5" pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-srz5x\": dial tcp 38.102.83.201:6443: connect: connection refused"
Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.132578 4860 status_manager.go:851] "Failed to get status for pod" podUID="2268b7ae-c1db-4ef4-8236-60f7cfa277a1" pod="openshift-marketplace/redhat-marketplace-5jpww" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5jpww\": dial tcp 38.102.83.201:6443: connect: connection refused"
Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.133123 4860 status_manager.go:851] "Failed to get status for pod" podUID="f3e81cb0-739d-41be-b9fb-53a1c3de6a3d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.201:6443: connect: connection refused"
Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.133611 4860 status_manager.go:851] "Failed to get status for pod" podUID="3587f3ba-577b-425a-adf5-336a8977dcc5" pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-srz5x\": dial tcp 38.102.83.201:6443: connect: connection refused"
Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.134010 4860 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.201:6443: connect: connection refused"
Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.135614 4860 scope.go:117] "RemoveContainer" containerID="7c8b8128f311582fc677a42230767316cb90b5bf061457d3564d8b09eec2a186"
Mar 20 10:59:57 crc kubenswrapper[4860]: E0320 10:59:57.138693 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c8b8128f311582fc677a42230767316cb90b5bf061457d3564d8b09eec2a186\": container with ID starting with 7c8b8128f311582fc677a42230767316cb90b5bf061457d3564d8b09eec2a186 not found: ID does not exist" containerID="7c8b8128f311582fc677a42230767316cb90b5bf061457d3564d8b09eec2a186"
Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.138802 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c8b8128f311582fc677a42230767316cb90b5bf061457d3564d8b09eec2a186"} err="failed to get container status \"7c8b8128f311582fc677a42230767316cb90b5bf061457d3564d8b09eec2a186\": rpc error: code = NotFound desc = could not find container \"7c8b8128f311582fc677a42230767316cb90b5bf061457d3564d8b09eec2a186\": container with ID starting with 7c8b8128f311582fc677a42230767316cb90b5bf061457d3564d8b09eec2a186 not found: ID does not exist"
Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.138853 4860 scope.go:117] "RemoveContainer" containerID="b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa"
Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.139996 4860 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.201:6443: connect: connection refused"
Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.140192 4860 status_manager.go:851] "Failed to get status for pod" podUID="2268b7ae-c1db-4ef4-8236-60f7cfa277a1" pod="openshift-marketplace/redhat-marketplace-5jpww" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5jpww\": dial tcp 38.102.83.201:6443: connect: connection refused"
Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.140483 4860 status_manager.go:851] "Failed to get status for pod" podUID="f3e81cb0-739d-41be-b9fb-53a1c3de6a3d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.201:6443: connect: connection refused"
Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.140862 4860 status_manager.go:851] "Failed to get status for pod" podUID="3587f3ba-577b-425a-adf5-336a8977dcc5" pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-srz5x\": dial tcp 38.102.83.201:6443: connect: connection refused"
Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.419531 4860 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.201:6443: connect: connection refused"
Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.420628 4860 status_manager.go:851] "Failed to get status for pod" podUID="2268b7ae-c1db-4ef4-8236-60f7cfa277a1" pod="openshift-marketplace/redhat-marketplace-5jpww" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5jpww\": dial tcp 38.102.83.201:6443: connect: connection refused"
Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.421285 4860 status_manager.go:851] "Failed to get status for pod" podUID="f3e81cb0-739d-41be-b9fb-53a1c3de6a3d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.201:6443: connect: connection refused"
Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.422048 4860 status_manager.go:851] "Failed to get status for pod" podUID="3587f3ba-577b-425a-adf5-336a8977dcc5" pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-srz5x\": dial tcp 38.102.83.201:6443: connect: connection refused"
Mar 20 10:59:58 crc kubenswrapper[4860]: I0320 10:59:58.147453 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Mar 20 10:59:58 crc kubenswrapper[4860]: I0320 10:59:58.381925 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Mar 20 10:59:58 crc kubenswrapper[4860]: I0320 10:59:58.383967 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 10:59:58 crc kubenswrapper[4860]: I0320 10:59:58.384625 4860 status_manager.go:851] "Failed to get status for pod" podUID="3587f3ba-577b-425a-adf5-336a8977dcc5" pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-srz5x\": dial tcp 38.102.83.201:6443: connect: connection refused"
Mar 20 10:59:58 crc kubenswrapper[4860]: I0320 10:59:58.385130 4860 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.201:6443: connect: connection refused"
Mar 20 10:59:58 crc kubenswrapper[4860]: I0320 10:59:58.385325 4860 status_manager.go:851] "Failed to get status for pod" podUID="2268b7ae-c1db-4ef4-8236-60f7cfa277a1" pod="openshift-marketplace/redhat-marketplace-5jpww" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5jpww\": dial tcp 38.102.83.201:6443: connect: connection refused"
Mar 20 10:59:58 crc kubenswrapper[4860]: I0320 10:59:58.385471 4860 status_manager.go:851] "Failed to get status for pod" podUID="f3e81cb0-739d-41be-b9fb-53a1c3de6a3d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.201:6443: connect: connection refused"
Mar 20 10:59:58 crc kubenswrapper[4860]: I0320 10:59:58.532075 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Mar 20 10:59:58 crc kubenswrapper[4860]: I0320 10:59:58.532740 4860 status_manager.go:851] "Failed to get status for pod" podUID="3587f3ba-577b-425a-adf5-336a8977dcc5" pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-srz5x\": dial tcp 38.102.83.201:6443: connect: connection refused"
Mar 20 10:59:58 crc kubenswrapper[4860]: I0320 10:59:58.533023 4860 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.201:6443: connect: connection refused"
Mar 20 10:59:58 crc kubenswrapper[4860]: I0320 10:59:58.533372 4860 status_manager.go:851] "Failed to get status for pod" podUID="2268b7ae-c1db-4ef4-8236-60f7cfa277a1" pod="openshift-marketplace/redhat-marketplace-5jpww" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5jpww\": dial tcp 38.102.83.201:6443: connect: connection refused"
Mar 20 10:59:58 crc kubenswrapper[4860]: I0320 10:59:58.533644 4860 status_manager.go:851] "Failed to get status for pod" podUID="f3e81cb0-739d-41be-b9fb-53a1c3de6a3d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.201:6443: connect: connection refused"
Mar 20 10:59:58 crc kubenswrapper[4860]: I0320 10:59:58.570630 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Mar 20 10:59:58 crc kubenswrapper[4860]: I0320 10:59:58.570775 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 10:59:58 crc kubenswrapper[4860]: I0320 10:59:58.571316 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Mar 20 10:59:58 crc kubenswrapper[4860]: I0320 10:59:58.571551 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Mar 20 10:59:58 crc kubenswrapper[4860]: I0320 10:59:58.571431 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 10:59:58 crc kubenswrapper[4860]: I0320 10:59:58.571591 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 10:59:58 crc kubenswrapper[4860]: I0320 10:59:58.572466 4860 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\""
Mar 20 10:59:58 crc kubenswrapper[4860]: I0320 10:59:58.572611 4860 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\""
Mar 20 10:59:58 crc kubenswrapper[4860]: I0320 10:59:58.572728 4860 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\""
Mar 20 10:59:58 crc kubenswrapper[4860]: I0320 10:59:58.674389 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f3e81cb0-739d-41be-b9fb-53a1c3de6a3d-var-lock\") pod \"f3e81cb0-739d-41be-b9fb-53a1c3de6a3d\" (UID: \"f3e81cb0-739d-41be-b9fb-53a1c3de6a3d\") "
Mar 20 10:59:58 crc kubenswrapper[4860]: I0320 10:59:58.674493 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f3e81cb0-739d-41be-b9fb-53a1c3de6a3d-kubelet-dir\") pod \"f3e81cb0-739d-41be-b9fb-53a1c3de6a3d\" (UID: \"f3e81cb0-739d-41be-b9fb-53a1c3de6a3d\") "
Mar 20 10:59:58 crc kubenswrapper[4860]: I0320 10:59:58.674570 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f3e81cb0-739d-41be-b9fb-53a1c3de6a3d-kube-api-access\") pod \"f3e81cb0-739d-41be-b9fb-53a1c3de6a3d\" (UID: \"f3e81cb0-739d-41be-b9fb-53a1c3de6a3d\") "
Mar 20 10:59:58 crc kubenswrapper[4860]: I0320 10:59:58.674562 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f3e81cb0-739d-41be-b9fb-53a1c3de6a3d-var-lock" (OuterVolumeSpecName: "var-lock") pod "f3e81cb0-739d-41be-b9fb-53a1c3de6a3d" (UID: "f3e81cb0-739d-41be-b9fb-53a1c3de6a3d"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 10:59:58 crc kubenswrapper[4860]: I0320 10:59:58.674645 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f3e81cb0-739d-41be-b9fb-53a1c3de6a3d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "f3e81cb0-739d-41be-b9fb-53a1c3de6a3d" (UID: "f3e81cb0-739d-41be-b9fb-53a1c3de6a3d"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 10:59:58 crc kubenswrapper[4860]: I0320 10:59:58.674996 4860 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f3e81cb0-739d-41be-b9fb-53a1c3de6a3d-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 20 10:59:58 crc kubenswrapper[4860]: I0320 10:59:58.675028 4860 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f3e81cb0-739d-41be-b9fb-53a1c3de6a3d-var-lock\") on node \"crc\" DevicePath \"\""
Mar 20 10:59:58 crc kubenswrapper[4860]: I0320 10:59:58.680869 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3e81cb0-739d-41be-b9fb-53a1c3de6a3d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "f3e81cb0-739d-41be-b9fb-53a1c3de6a3d" (UID: "f3e81cb0-739d-41be-b9fb-53a1c3de6a3d"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 10:59:58 crc kubenswrapper[4860]: I0320 10:59:58.776953 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f3e81cb0-739d-41be-b9fb-53a1c3de6a3d-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 20 10:59:59 crc kubenswrapper[4860]: I0320 10:59:59.155705 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"f3e81cb0-739d-41be-b9fb-53a1c3de6a3d","Type":"ContainerDied","Data":"e4aa9d22701e1f5fe237a916fc50c217011c513544d30da810d469bc44fe2386"}
Mar 20 10:59:59 crc kubenswrapper[4860]: I0320 10:59:59.155758 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e4aa9d22701e1f5fe237a916fc50c217011c513544d30da810d469bc44fe2386"
Mar 20 10:59:59 crc kubenswrapper[4860]: I0320 10:59:59.155762 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Mar 20 10:59:59 crc kubenswrapper[4860]: I0320 10:59:59.161419 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Mar 20 10:59:59 crc kubenswrapper[4860]: I0320 10:59:59.162570 4860 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433" exitCode=0
Mar 20 10:59:59 crc kubenswrapper[4860]: I0320 10:59:59.162634 4860 scope.go:117] "RemoveContainer" containerID="2c14e94ed75bdc67469f86002f56ed08fc3810f5a6316401c00a66d56d405e14"
Mar 20 10:59:59 crc kubenswrapper[4860]: I0320 10:59:59.162699 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 10:59:59 crc kubenswrapper[4860]: I0320 10:59:59.169875 4860 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.201:6443: connect: connection refused"
Mar 20 10:59:59 crc kubenswrapper[4860]: I0320 10:59:59.170662 4860 status_manager.go:851] "Failed to get status for pod" podUID="2268b7ae-c1db-4ef4-8236-60f7cfa277a1" pod="openshift-marketplace/redhat-marketplace-5jpww" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5jpww\": dial tcp 38.102.83.201:6443: connect: connection refused"
Mar 20 10:59:59 crc kubenswrapper[4860]: I0320 10:59:59.171173 4860 status_manager.go:851] "Failed to get status for pod" podUID="f3e81cb0-739d-41be-b9fb-53a1c3de6a3d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.201:6443: connect: connection refused"
Mar 20 10:59:59 crc kubenswrapper[4860]: I0320 10:59:59.171440 4860 status_manager.go:851] "Failed to get status for pod" podUID="3587f3ba-577b-425a-adf5-336a8977dcc5" pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-srz5x\": dial tcp 38.102.83.201:6443: connect: connection refused"
Mar 20 10:59:59 crc kubenswrapper[4860]: I0320 10:59:59.177836 4860 status_manager.go:851] "Failed to get status for pod" podUID="f3e81cb0-739d-41be-b9fb-53a1c3de6a3d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.201:6443: connect: connection refused"
Mar 20 10:59:59 crc kubenswrapper[4860]: I0320 10:59:59.178046 4860 status_manager.go:851] "Failed to get status for pod" podUID="3587f3ba-577b-425a-adf5-336a8977dcc5" pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-srz5x\": dial tcp 38.102.83.201:6443: connect: connection refused"
Mar 20 10:59:59 crc kubenswrapper[4860]: I0320 10:59:59.178282 4860 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.201:6443: connect: connection refused"
Mar 20 10:59:59 crc kubenswrapper[4860]: I0320 10:59:59.178604 4860 status_manager.go:851] "Failed to get status for pod" podUID="2268b7ae-c1db-4ef4-8236-60f7cfa277a1" pod="openshift-marketplace/redhat-marketplace-5jpww" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5jpww\": dial tcp 38.102.83.201:6443: connect: connection refused"
Mar 20 10:59:59 crc kubenswrapper[4860]: I0320 10:59:59.199545 4860 scope.go:117] "RemoveContainer" containerID="7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392"
Mar 20 10:59:59 crc kubenswrapper[4860]: I0320 10:59:59.218343 4860 scope.go:117] "RemoveContainer" containerID="59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017"
Mar 20 10:59:59 crc kubenswrapper[4860]: I0320 10:59:59.231818 4860 scope.go:117] "RemoveContainer" containerID="e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e"
Mar 20 10:59:59 crc kubenswrapper[4860]: I0320 10:59:59.251164 4860 scope.go:117] "RemoveContainer" containerID="b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433"
Mar 20 10:59:59 crc kubenswrapper[4860]: I0320 10:59:59.270986 4860 scope.go:117] "RemoveContainer" containerID="a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4"
Mar 20 10:59:59 crc kubenswrapper[4860]: I0320 10:59:59.294864 4860 scope.go:117] "RemoveContainer" containerID="2c14e94ed75bdc67469f86002f56ed08fc3810f5a6316401c00a66d56d405e14"
Mar 20 10:59:59 crc kubenswrapper[4860]: E0320 10:59:59.295414 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c14e94ed75bdc67469f86002f56ed08fc3810f5a6316401c00a66d56d405e14\": container with ID starting with 2c14e94ed75bdc67469f86002f56ed08fc3810f5a6316401c00a66d56d405e14 not found: ID does not exist" containerID="2c14e94ed75bdc67469f86002f56ed08fc3810f5a6316401c00a66d56d405e14"
Mar 20 10:59:59 crc kubenswrapper[4860]: I0320 10:59:59.295545 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c14e94ed75bdc67469f86002f56ed08fc3810f5a6316401c00a66d56d405e14"} err="failed to get container status \"2c14e94ed75bdc67469f86002f56ed08fc3810f5a6316401c00a66d56d405e14\": rpc error: code = NotFound desc = could not find container \"2c14e94ed75bdc67469f86002f56ed08fc3810f5a6316401c00a66d56d405e14\": container with ID starting with 2c14e94ed75bdc67469f86002f56ed08fc3810f5a6316401c00a66d56d405e14 not found: ID does not exist"
Mar 20 10:59:59 crc kubenswrapper[4860]: I0320 10:59:59.295641 4860 scope.go:117] "RemoveContainer" containerID="7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392"
Mar 20 10:59:59 crc kubenswrapper[4860]: E0320 10:59:59.296215 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\": container with ID starting with 7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392 not found: ID does not exist" containerID="7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392"
Mar 20 10:59:59 crc kubenswrapper[4860]: I0320 10:59:59.296280 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392"} err="failed to get container status \"7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\": rpc error: code = NotFound desc = could not find container \"7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\": container with ID starting with 7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392 not found: ID does not exist"
Mar 20 10:59:59 crc kubenswrapper[4860]: I0320 10:59:59.296306 4860 scope.go:117] "RemoveContainer" containerID="59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017"
Mar 20 10:59:59 crc kubenswrapper[4860]: E0320 10:59:59.296612 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\": container with ID starting with 59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017 not found: ID does not exist" containerID="59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017"
Mar 20 10:59:59 crc kubenswrapper[4860]: I0320 10:59:59.296640 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017"} err="failed to get container status \"59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\": rpc error: code = NotFound desc = could not find container \"59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\": container with ID starting with 59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017 not found: ID does not exist"
Mar 20 10:59:59 crc kubenswrapper[4860]: I0320 10:59:59.296658 4860 scope.go:117] "RemoveContainer" containerID="e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e"
Mar 20 10:59:59 crc kubenswrapper[4860]: E0320 10:59:59.296998 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\": container with ID starting with e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e not found: ID does not exist" containerID="e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e"
Mar 20 10:59:59 crc kubenswrapper[4860]: I0320 10:59:59.297109 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e"} err="failed to get container status \"e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\": rpc error: code = NotFound desc = could not find container \"e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\": container with ID starting with e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e not found: ID does not exist"
Mar 20 10:59:59 crc kubenswrapper[4860]: I0320 10:59:59.297540 4860 scope.go:117] "RemoveContainer" containerID="b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433"
Mar 20 10:59:59 crc kubenswrapper[4860]: E0320 10:59:59.298371 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\": container with ID starting with b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433 not found: ID does not exist" containerID="b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433"
Mar 20 10:59:59 crc kubenswrapper[4860]: I0320 10:59:59.298402 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433"} err="failed to get container status \"b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\": rpc error: code = NotFound desc = could not find container \"b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\": container with ID starting with b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433 not found: ID does not exist"
Mar 20 10:59:59 crc kubenswrapper[4860]: I0320 10:59:59.298418 4860 scope.go:117] "RemoveContainer" containerID="a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4"
Mar 20 10:59:59 crc kubenswrapper[4860]: E0320 10:59:59.298722 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\": container with ID starting with a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4 not found: ID does not exist" containerID="a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4"
Mar 20 10:59:59 crc kubenswrapper[4860]: I0320 10:59:59.298748 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4"} err="failed to get container status \"a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\": rpc error: code = NotFound desc = could not find container \"a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\": container with ID starting with a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4 not found: ID does not exist"
Mar 20 10:59:59 crc kubenswrapper[4860]: I0320 10:59:59.437044 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes"
Mar 20 10:59:59 crc kubenswrapper[4860]: E0320 10:59:59.491049 4860 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.201:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" volumeName="registry-storage"
Mar 20 11:00:00 crc kubenswrapper[4860]: E0320 11:00:00.964653 4860 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused"
Mar 20 11:00:00 crc kubenswrapper[4860]: E0320 11:00:00.965663 4860 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused"
Mar 20 11:00:00 crc kubenswrapper[4860]: E0320 11:00:00.965937 4860 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused"
Mar 20 11:00:00 crc kubenswrapper[4860]: E0320 11:00:00.966173 4860 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused"
Mar 20 11:00:00 crc kubenswrapper[4860]: E0320 11:00:00.966457 4860 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused"
Mar 20 11:00:00 crc kubenswrapper[4860]: I0320 11:00:00.966490 4860 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease"
Mar 20 11:00:00 crc kubenswrapper[4860]: E0320 11:00:00.966824 4860 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused" interval="200ms"
Mar 20 11:00:01 crc kubenswrapper[4860]: E0320 11:00:01.170163 4860 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused" interval="400ms"
Mar 20 11:00:01 crc kubenswrapper[4860]: I0320 11:00:01.207819 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-r7ckk"
Mar 20 11:00:01 crc kubenswrapper[4860]: I0320 11:00:01.208577 4860 status_manager.go:851] "Failed to get status for pod" podUID="3587f3ba-577b-425a-adf5-336a8977dcc5" pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-srz5x\": dial tcp 38.102.83.201:6443: connect: connection refused"
Mar 20 11:00:01 crc kubenswrapper[4860]: I0320 11:00:01.208842 4860 status_manager.go:851] "Failed to get status for pod" podUID="f0e14a08-824b-450f-bf98-2a476da0d44b" pod="openshift-marketplace/community-operators-r7ckk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-r7ckk\": dial tcp 38.102.83.201:6443: connect: connection refused"
Mar 20 11:00:01 crc kubenswrapper[4860]: I0320 11:00:01.209155 4860 status_manager.go:851] "Failed to get status for pod" podUID="2268b7ae-c1db-4ef4-8236-60f7cfa277a1" pod="openshift-marketplace/redhat-marketplace-5jpww" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5jpww\": dial tcp 38.102.83.201:6443: connect: connection refused"
Mar 20 11:00:01 crc kubenswrapper[4860]: I0320 11:00:01.209545 4860 status_manager.go:851] "Failed to get status for pod" podUID="f3e81cb0-739d-41be-b9fb-53a1c3de6a3d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.201:6443: connect: connection refused"
Mar 20 11:00:01 crc kubenswrapper[4860]: I0320 11:00:01.247241 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-r7ckk"
Mar 20 11:00:01 crc kubenswrapper[4860]: I0320 11:00:01.247737 4860 status_manager.go:851] "Failed to get status for pod" podUID="3587f3ba-577b-425a-adf5-336a8977dcc5" pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-srz5x\": dial tcp 38.102.83.201:6443: connect: connection refused"
Mar 20 11:00:01 crc kubenswrapper[4860]: I0320 11:00:01.248158 4860 status_manager.go:851] "Failed to get status for pod" podUID="f0e14a08-824b-450f-bf98-2a476da0d44b" pod="openshift-marketplace/community-operators-r7ckk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-r7ckk\": dial tcp 38.102.83.201:6443: connect: connection refused"
Mar 20 11:00:01 crc kubenswrapper[4860]: I0320 11:00:01.248518 4860 status_manager.go:851] "Failed to get status for pod" podUID="2268b7ae-c1db-4ef4-8236-60f7cfa277a1" pod="openshift-marketplace/redhat-marketplace-5jpww" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5jpww\": dial tcp 38.102.83.201:6443: connect: connection refused"
Mar 20 11:00:01 crc kubenswrapper[4860]: I0320 11:00:01.249029 4860 status_manager.go:851] "Failed to get status for pod" podUID="f3e81cb0-739d-41be-b9fb-53a1c3de6a3d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.201:6443: connect: connection refused"
Mar 20 11:00:01 crc kubenswrapper[4860]: E0320 11:00:01.571914 4860 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused" interval="800ms"
Mar 20 11:00:02 crc kubenswrapper[4860]: E0320 11:00:02.373024 4860 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused" interval="1.6s"
Mar 20 11:00:03 crc kubenswrapper[4860]: I0320 11:00:03.697593 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qq8bh"
Mar 20 11:00:03 crc kubenswrapper[4860]: I0320 11:00:03.698479 4860 status_manager.go:851] "Failed to get status for pod" podUID="3587f3ba-577b-425a-adf5-336a8977dcc5" pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-srz5x\": dial tcp 38.102.83.201:6443: connect: connection refused"
Mar 20 11:00:03 crc kubenswrapper[4860]: I0320 11:00:03.698918 4860 status_manager.go:851] "Failed to get status for pod" podUID="514f05c3-1404-46c6-9f4d-68437ea8ee0b" pod="openshift-marketplace/redhat-operators-qq8bh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-qq8bh\": dial tcp 38.102.83.201:6443: connect: connection refused"
Mar 20 11:00:03 crc kubenswrapper[4860]: I0320 11:00:03.699129 4860 status_manager.go:851] "Failed to get status for pod" podUID="f0e14a08-824b-450f-bf98-2a476da0d44b" pod="openshift-marketplace/community-operators-r7ckk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-r7ckk\": dial tcp 38.102.83.201:6443: connect: connection refused"
Mar 20 11:00:03 crc kubenswrapper[4860]: I0320 11:00:03.699347 4860 status_manager.go:851] "Failed to get status for pod" podUID="2268b7ae-c1db-4ef4-8236-60f7cfa277a1" pod="openshift-marketplace/redhat-marketplace-5jpww" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5jpww\": dial tcp 38.102.83.201:6443: connect: connection refused"
Mar 20 11:00:03 crc kubenswrapper[4860]: I0320 11:00:03.699701 4860 status_manager.go:851] "Failed to get status for pod" podUID="f3e81cb0-739d-41be-b9fb-53a1c3de6a3d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.201:6443: connect: connection refused"
Mar 20 11:00:03 crc kubenswrapper[4860]: I0320 11:00:03.731763 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qq8bh"
Mar 20 11:00:03 crc kubenswrapper[4860]: I0320 11:00:03.732329 4860 status_manager.go:851] "Failed to get status for pod" podUID="f0e14a08-824b-450f-bf98-2a476da0d44b" pod="openshift-marketplace/community-operators-r7ckk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-r7ckk\": dial tcp 38.102.83.201:6443: connect: connection refused"
Mar 20
11:00:03 crc kubenswrapper[4860]: I0320 11:00:03.732546 4860 status_manager.go:851] "Failed to get status for pod" podUID="2268b7ae-c1db-4ef4-8236-60f7cfa277a1" pod="openshift-marketplace/redhat-marketplace-5jpww" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5jpww\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:03 crc kubenswrapper[4860]: I0320 11:00:03.732850 4860 status_manager.go:851] "Failed to get status for pod" podUID="f3e81cb0-739d-41be-b9fb-53a1c3de6a3d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:03 crc kubenswrapper[4860]: I0320 11:00:03.733411 4860 status_manager.go:851] "Failed to get status for pod" podUID="3587f3ba-577b-425a-adf5-336a8977dcc5" pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-srz5x\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:03 crc kubenswrapper[4860]: I0320 11:00:03.733584 4860 status_manager.go:851] "Failed to get status for pod" podUID="514f05c3-1404-46c6-9f4d-68437ea8ee0b" pod="openshift-marketplace/redhat-operators-qq8bh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-qq8bh\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:03 crc kubenswrapper[4860]: E0320 11:00:03.974704 4860 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused" interval="3.2s" Mar 20 11:00:04 crc kubenswrapper[4860]: I0320 11:00:04.121966 4860 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jx27x" Mar 20 11:00:04 crc kubenswrapper[4860]: I0320 11:00:04.122837 4860 status_manager.go:851] "Failed to get status for pod" podUID="f3e81cb0-739d-41be-b9fb-53a1c3de6a3d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:04 crc kubenswrapper[4860]: I0320 11:00:04.123184 4860 status_manager.go:851] "Failed to get status for pod" podUID="f81a43aa-2c39-4d49-8526-f097322dd7bf" pod="openshift-marketplace/redhat-operators-jx27x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jx27x\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:04 crc kubenswrapper[4860]: I0320 11:00:04.123701 4860 status_manager.go:851] "Failed to get status for pod" podUID="3587f3ba-577b-425a-adf5-336a8977dcc5" pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-srz5x\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:04 crc kubenswrapper[4860]: I0320 11:00:04.124082 4860 status_manager.go:851] "Failed to get status for pod" podUID="514f05c3-1404-46c6-9f4d-68437ea8ee0b" pod="openshift-marketplace/redhat-operators-qq8bh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-qq8bh\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:04 crc kubenswrapper[4860]: I0320 11:00:04.124407 4860 status_manager.go:851] "Failed to get status for pod" podUID="f0e14a08-824b-450f-bf98-2a476da0d44b" pod="openshift-marketplace/community-operators-r7ckk" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-r7ckk\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:04 crc kubenswrapper[4860]: I0320 11:00:04.124728 4860 status_manager.go:851] "Failed to get status for pod" podUID="2268b7ae-c1db-4ef4-8236-60f7cfa277a1" pod="openshift-marketplace/redhat-marketplace-5jpww" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5jpww\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:04 crc kubenswrapper[4860]: I0320 11:00:04.156509 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jx27x" Mar 20 11:00:04 crc kubenswrapper[4860]: I0320 11:00:04.157251 4860 status_manager.go:851] "Failed to get status for pod" podUID="3587f3ba-577b-425a-adf5-336a8977dcc5" pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-srz5x\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:04 crc kubenswrapper[4860]: I0320 11:00:04.157627 4860 status_manager.go:851] "Failed to get status for pod" podUID="514f05c3-1404-46c6-9f4d-68437ea8ee0b" pod="openshift-marketplace/redhat-operators-qq8bh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-qq8bh\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:04 crc kubenswrapper[4860]: I0320 11:00:04.158100 4860 status_manager.go:851] "Failed to get status for pod" podUID="f0e14a08-824b-450f-bf98-2a476da0d44b" pod="openshift-marketplace/community-operators-r7ckk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-r7ckk\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:04 crc kubenswrapper[4860]: 
I0320 11:00:04.158359 4860 status_manager.go:851] "Failed to get status for pod" podUID="2268b7ae-c1db-4ef4-8236-60f7cfa277a1" pod="openshift-marketplace/redhat-marketplace-5jpww" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5jpww\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:04 crc kubenswrapper[4860]: I0320 11:00:04.158617 4860 status_manager.go:851] "Failed to get status for pod" podUID="f81a43aa-2c39-4d49-8526-f097322dd7bf" pod="openshift-marketplace/redhat-operators-jx27x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jx27x\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:04 crc kubenswrapper[4860]: I0320 11:00:04.158878 4860 status_manager.go:851] "Failed to get status for pod" podUID="f3e81cb0-739d-41be-b9fb-53a1c3de6a3d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:05 crc kubenswrapper[4860]: E0320 11:00:05.658164 4860 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.201:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189e8799770a1bd4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:59:56.427451348 +0000 UTC m=+320.648812246,LastTimestamp:2026-03-20 10:59:56.427451348 +0000 UTC m=+320.648812246,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 11:00:07 crc kubenswrapper[4860]: E0320 11:00:07.175978 4860 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused" interval="6.4s" Mar 20 11:00:07 crc kubenswrapper[4860]: I0320 11:00:07.417334 4860 status_manager.go:851] "Failed to get status for pod" podUID="3587f3ba-577b-425a-adf5-336a8977dcc5" pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-srz5x\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:07 crc kubenswrapper[4860]: I0320 11:00:07.417940 4860 status_manager.go:851] "Failed to get status for pod" podUID="514f05c3-1404-46c6-9f4d-68437ea8ee0b" pod="openshift-marketplace/redhat-operators-qq8bh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-qq8bh\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:07 crc kubenswrapper[4860]: I0320 11:00:07.418448 4860 status_manager.go:851] "Failed to get status for pod" podUID="f0e14a08-824b-450f-bf98-2a476da0d44b" pod="openshift-marketplace/community-operators-r7ckk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-r7ckk\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:07 crc kubenswrapper[4860]: I0320 11:00:07.418853 4860 status_manager.go:851] "Failed to get status for 
pod" podUID="2268b7ae-c1db-4ef4-8236-60f7cfa277a1" pod="openshift-marketplace/redhat-marketplace-5jpww" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5jpww\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:07 crc kubenswrapper[4860]: I0320 11:00:07.419283 4860 status_manager.go:851] "Failed to get status for pod" podUID="f3e81cb0-739d-41be-b9fb-53a1c3de6a3d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:07 crc kubenswrapper[4860]: I0320 11:00:07.419548 4860 status_manager.go:851] "Failed to get status for pod" podUID="f81a43aa-2c39-4d49-8526-f097322dd7bf" pod="openshift-marketplace/redhat-operators-jx27x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jx27x\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:10 crc kubenswrapper[4860]: I0320 11:00:10.241713 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 20 11:00:10 crc kubenswrapper[4860]: I0320 11:00:10.243970 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 20 11:00:10 crc kubenswrapper[4860]: I0320 11:00:10.244058 4860 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="6bf4a38879e8e3c687bd3c57a3c68a29ad9a9e609ea0cbd220493b6ee4e7d9a3" exitCode=1 Mar 20 11:00:10 crc kubenswrapper[4860]: I0320 11:00:10.244116 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"6bf4a38879e8e3c687bd3c57a3c68a29ad9a9e609ea0cbd220493b6ee4e7d9a3"} Mar 20 11:00:10 crc kubenswrapper[4860]: I0320 11:00:10.244914 4860 scope.go:117] "RemoveContainer" containerID="6bf4a38879e8e3c687bd3c57a3c68a29ad9a9e609ea0cbd220493b6ee4e7d9a3" Mar 20 11:00:10 crc kubenswrapper[4860]: I0320 11:00:10.245294 4860 status_manager.go:851] "Failed to get status for pod" podUID="2268b7ae-c1db-4ef4-8236-60f7cfa277a1" pod="openshift-marketplace/redhat-marketplace-5jpww" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5jpww\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:10 crc kubenswrapper[4860]: I0320 11:00:10.245830 4860 status_manager.go:851] "Failed to get status for pod" podUID="f81a43aa-2c39-4d49-8526-f097322dd7bf" pod="openshift-marketplace/redhat-operators-jx27x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jx27x\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:10 crc kubenswrapper[4860]: I0320 11:00:10.246290 4860 status_manager.go:851] "Failed to get status for pod" podUID="f3e81cb0-739d-41be-b9fb-53a1c3de6a3d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:10 crc kubenswrapper[4860]: I0320 11:00:10.246856 4860 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 
11:00:10 crc kubenswrapper[4860]: I0320 11:00:10.247128 4860 status_manager.go:851] "Failed to get status for pod" podUID="3587f3ba-577b-425a-adf5-336a8977dcc5" pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-srz5x\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:10 crc kubenswrapper[4860]: I0320 11:00:10.247527 4860 status_manager.go:851] "Failed to get status for pod" podUID="514f05c3-1404-46c6-9f4d-68437ea8ee0b" pod="openshift-marketplace/redhat-operators-qq8bh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-qq8bh\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:10 crc kubenswrapper[4860]: I0320 11:00:10.247841 4860 status_manager.go:851] "Failed to get status for pod" podUID="f0e14a08-824b-450f-bf98-2a476da0d44b" pod="openshift-marketplace/community-operators-r7ckk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-r7ckk\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:10 crc kubenswrapper[4860]: I0320 11:00:10.366478 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 11:00:10 crc kubenswrapper[4860]: I0320 11:00:10.412421 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 11:00:10 crc kubenswrapper[4860]: I0320 11:00:10.413579 4860 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:10 crc kubenswrapper[4860]: I0320 11:00:10.414119 4860 status_manager.go:851] "Failed to get status for pod" podUID="3587f3ba-577b-425a-adf5-336a8977dcc5" pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-srz5x\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:10 crc kubenswrapper[4860]: I0320 11:00:10.414759 4860 status_manager.go:851] "Failed to get status for pod" podUID="514f05c3-1404-46c6-9f4d-68437ea8ee0b" pod="openshift-marketplace/redhat-operators-qq8bh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-qq8bh\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:10 crc kubenswrapper[4860]: I0320 11:00:10.415288 4860 status_manager.go:851] "Failed to get status for pod" podUID="f0e14a08-824b-450f-bf98-2a476da0d44b" pod="openshift-marketplace/community-operators-r7ckk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-r7ckk\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:10 crc kubenswrapper[4860]: I0320 11:00:10.415651 4860 status_manager.go:851] "Failed to get status for pod" podUID="2268b7ae-c1db-4ef4-8236-60f7cfa277a1" pod="openshift-marketplace/redhat-marketplace-5jpww" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5jpww\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:10 crc kubenswrapper[4860]: I0320 11:00:10.415929 4860 status_manager.go:851] "Failed to get status for pod" podUID="f81a43aa-2c39-4d49-8526-f097322dd7bf" pod="openshift-marketplace/redhat-operators-jx27x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jx27x\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:10 crc kubenswrapper[4860]: I0320 11:00:10.416363 4860 status_manager.go:851] "Failed to get status for pod" podUID="f3e81cb0-739d-41be-b9fb-53a1c3de6a3d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:10 crc kubenswrapper[4860]: I0320 11:00:10.441064 4860 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="26618d38-6c86-4f4d-84d0-33bd5a64ca4a" Mar 20 11:00:10 crc kubenswrapper[4860]: I0320 11:00:10.441089 4860 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="26618d38-6c86-4f4d-84d0-33bd5a64ca4a" Mar 20 11:00:10 crc kubenswrapper[4860]: E0320 11:00:10.441418 4860 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 11:00:10 crc kubenswrapper[4860]: I0320 11:00:10.441835 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 11:00:10 crc kubenswrapper[4860]: W0320 11:00:10.459634 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-2ced920ad0ddef898531e2af9377d7dd911b48390d6b20b59b6c428c03cff84f WatchSource:0}: Error finding container 2ced920ad0ddef898531e2af9377d7dd911b48390d6b20b59b6c428c03cff84f: Status 404 returned error can't find the container with id 2ced920ad0ddef898531e2af9377d7dd911b48390d6b20b59b6c428c03cff84f Mar 20 11:00:11 crc kubenswrapper[4860]: I0320 11:00:11.256353 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 20 11:00:11 crc kubenswrapper[4860]: I0320 11:00:11.259094 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 20 11:00:11 crc kubenswrapper[4860]: I0320 11:00:11.259212 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ee792746dcee736a8a92bd7604013dd0c65310e6c461c9c93a8ab79ab26208fd"} Mar 20 11:00:11 crc kubenswrapper[4860]: I0320 11:00:11.260793 4860 status_manager.go:851] "Failed to get status for pod" podUID="514f05c3-1404-46c6-9f4d-68437ea8ee0b" pod="openshift-marketplace/redhat-operators-qq8bh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-qq8bh\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:11 crc kubenswrapper[4860]: I0320 11:00:11.261073 4860 status_manager.go:851] "Failed to get status for pod" 
podUID="f0e14a08-824b-450f-bf98-2a476da0d44b" pod="openshift-marketplace/community-operators-r7ckk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-r7ckk\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:11 crc kubenswrapper[4860]: I0320 11:00:11.261569 4860 status_manager.go:851] "Failed to get status for pod" podUID="2268b7ae-c1db-4ef4-8236-60f7cfa277a1" pod="openshift-marketplace/redhat-marketplace-5jpww" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5jpww\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:11 crc kubenswrapper[4860]: I0320 11:00:11.261875 4860 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="783124f64419143d0101e576dce78700db95db48469864c2940162caae521b15" exitCode=0 Mar 20 11:00:11 crc kubenswrapper[4860]: I0320 11:00:11.261907 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"783124f64419143d0101e576dce78700db95db48469864c2940162caae521b15"} Mar 20 11:00:11 crc kubenswrapper[4860]: I0320 11:00:11.261925 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"2ced920ad0ddef898531e2af9377d7dd911b48390d6b20b59b6c428c03cff84f"} Mar 20 11:00:11 crc kubenswrapper[4860]: I0320 11:00:11.262091 4860 status_manager.go:851] "Failed to get status for pod" podUID="f81a43aa-2c39-4d49-8526-f097322dd7bf" pod="openshift-marketplace/redhat-operators-jx27x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jx27x\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:11 crc kubenswrapper[4860]: I0320 
11:00:11.262198 4860 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="26618d38-6c86-4f4d-84d0-33bd5a64ca4a" Mar 20 11:00:11 crc kubenswrapper[4860]: I0320 11:00:11.262218 4860 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="26618d38-6c86-4f4d-84d0-33bd5a64ca4a" Mar 20 11:00:11 crc kubenswrapper[4860]: I0320 11:00:11.262428 4860 status_manager.go:851] "Failed to get status for pod" podUID="f3e81cb0-739d-41be-b9fb-53a1c3de6a3d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:11 crc kubenswrapper[4860]: E0320 11:00:11.262509 4860 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 11:00:11 crc kubenswrapper[4860]: I0320 11:00:11.262886 4860 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:11 crc kubenswrapper[4860]: I0320 11:00:11.263469 4860 status_manager.go:851] "Failed to get status for pod" podUID="3587f3ba-577b-425a-adf5-336a8977dcc5" pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-srz5x\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:11 crc kubenswrapper[4860]: I0320 11:00:11.263881 
4860 status_manager.go:851] "Failed to get status for pod" podUID="f81a43aa-2c39-4d49-8526-f097322dd7bf" pod="openshift-marketplace/redhat-operators-jx27x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jx27x\": dial tcp 38.102.83.201:6443: connect: connection refused"
Mar 20 11:00:11 crc kubenswrapper[4860]: I0320 11:00:11.264097 4860 status_manager.go:851] "Failed to get status for pod" podUID="f3e81cb0-739d-41be-b9fb-53a1c3de6a3d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.201:6443: connect: connection refused"
Mar 20 11:00:11 crc kubenswrapper[4860]: I0320 11:00:11.264306 4860 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.201:6443: connect: connection refused"
Mar 20 11:00:11 crc kubenswrapper[4860]: I0320 11:00:11.264552 4860 status_manager.go:851] "Failed to get status for pod" podUID="3587f3ba-577b-425a-adf5-336a8977dcc5" pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-srz5x\": dial tcp 38.102.83.201:6443: connect: connection refused"
Mar 20 11:00:11 crc kubenswrapper[4860]: I0320 11:00:11.264856 4860 status_manager.go:851] "Failed to get status for pod" podUID="514f05c3-1404-46c6-9f4d-68437ea8ee0b" pod="openshift-marketplace/redhat-operators-qq8bh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-qq8bh\": dial tcp 38.102.83.201:6443: connect: connection refused"
Mar 20 11:00:11 crc kubenswrapper[4860]: I0320 11:00:11.265153 4860 status_manager.go:851] "Failed to get status for pod" podUID="f0e14a08-824b-450f-bf98-2a476da0d44b" pod="openshift-marketplace/community-operators-r7ckk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-r7ckk\": dial tcp 38.102.83.201:6443: connect: connection refused"
Mar 20 11:00:11 crc kubenswrapper[4860]: I0320 11:00:11.265412 4860 status_manager.go:851] "Failed to get status for pod" podUID="2268b7ae-c1db-4ef4-8236-60f7cfa277a1" pod="openshift-marketplace/redhat-marketplace-5jpww" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5jpww\": dial tcp 38.102.83.201:6443: connect: connection refused"
Mar 20 11:00:12 crc kubenswrapper[4860]: I0320 11:00:12.292824 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f7c77282f274e0123893235d280f7fbcdff5c0ff3e6ac1721888c21de20119ce"}
Mar 20 11:00:12 crc kubenswrapper[4860]: I0320 11:00:12.293269 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"c8fb6b49c461b4613d59d87fe47cecaa238e1f4a8609211f65cccb46de6ca211"}
Mar 20 11:00:12 crc kubenswrapper[4860]: I0320 11:00:12.293286 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"fb29e37c0c3ac7d25e540a15f333f21f7de994198712f6ad16c75dbd9d4a97dc"}
Mar 20 11:00:13 crc kubenswrapper[4860]: I0320 11:00:13.303076 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"63cf051bc188fb176018b83ebc88fa9d85730fe6418fdf7fd49c1e07a97c91d6"}
Mar 20 11:00:13 crc kubenswrapper[4860]: I0320 11:00:13.303530 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 11:00:13 crc kubenswrapper[4860]: I0320 11:00:13.303548 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"37e66cee46ff35dd813a124de744e22432e2a3b9bced1ce2fbc404db4daa6386"}
Mar 20 11:00:13 crc kubenswrapper[4860]: I0320 11:00:13.304522 4860 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="26618d38-6c86-4f4d-84d0-33bd5a64ca4a"
Mar 20 11:00:13 crc kubenswrapper[4860]: I0320 11:00:13.304662 4860 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="26618d38-6c86-4f4d-84d0-33bd5a64ca4a"
Mar 20 11:00:13 crc kubenswrapper[4860]: I0320 11:00:13.438318 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 11:00:13 crc kubenswrapper[4860]: I0320 11:00:13.438608 4860 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Mar 20 11:00:13 crc kubenswrapper[4860]: I0320 11:00:13.438665 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Mar 20 11:00:14 crc kubenswrapper[4860]: I0320 11:00:14.442854 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 11:00:14 crc kubenswrapper[4860]: I0320 11:00:14.442907 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 11:00:14 crc kubenswrapper[4860]: I0320 11:00:14.442925 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 11:00:14 crc kubenswrapper[4860]: I0320 11:00:14.442966 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 11:00:15 crc kubenswrapper[4860]: I0320 11:00:15.443114 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 11:00:15 crc kubenswrapper[4860]: I0320 11:00:15.443732 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 11:00:15 crc kubenswrapper[4860]: E0320 11:00:15.443259 4860 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: failed to sync secret cache: timed out waiting for the condition
Mar 20 11:00:15 crc kubenswrapper[4860]: E0320 11:00:15.443803 4860 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: failed to sync configmap cache: timed out waiting for the condition
Mar 20 11:00:15 crc kubenswrapper[4860]: E0320 11:00:15.443851 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 11:02:17.443817694 +0000 UTC m=+461.665178592 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : failed to sync secret cache: timed out waiting for the condition
Mar 20 11:00:15 crc kubenswrapper[4860]: E0320 11:00:15.443291 4860 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition
Mar 20 11:00:15 crc kubenswrapper[4860]: E0320 11:00:15.443670 4860 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition
Mar 20 11:00:15 crc kubenswrapper[4860]: E0320 11:00:15.443940 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 11:02:17.443904256 +0000 UTC m=+461.665265194 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : failed to sync configmap cache: timed out waiting for the condition
Mar 20 11:00:15 crc kubenswrapper[4860]: I0320 11:00:15.447330 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 20 11:00:15 crc kubenswrapper[4860]: I0320 11:00:15.449889 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 11:00:15 crc kubenswrapper[4860]: E0320 11:00:15.454435 4860 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: failed to sync configmap cache: timed out waiting for the condition
Mar 20 11:00:15 crc kubenswrapper[4860]: E0320 11:00:15.454463 4860 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: failed to sync configmap cache: timed out waiting for the condition
Mar 20 11:00:15 crc kubenswrapper[4860]: E0320 11:00:15.454524 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 11:02:17.454505175 +0000 UTC m=+461.675866073 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : failed to sync configmap cache: timed out waiting for the condition
Mar 20 11:00:15 crc kubenswrapper[4860]: E0320 11:00:15.454547 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 11:02:17.454538566 +0000 UTC m=+461.675899674 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : failed to sync configmap cache: timed out waiting for the condition
Mar 20 11:00:18 crc kubenswrapper[4860]: I0320 11:00:18.320606 4860 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 11:00:18 crc kubenswrapper[4860]: I0320 11:00:18.510707 4860 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="8d2e5ccd-eb92-4cfd-95cc-574044b7a8cd"
Mar 20 11:00:19 crc kubenswrapper[4860]: I0320 11:00:19.342101 4860 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="26618d38-6c86-4f4d-84d0-33bd5a64ca4a"
Mar 20 11:00:19 crc kubenswrapper[4860]: I0320 11:00:19.342153 4860 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="26618d38-6c86-4f4d-84d0-33bd5a64ca4a"
Mar 20 11:00:19 crc kubenswrapper[4860]: I0320 11:00:19.346553 4860 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="8d2e5ccd-eb92-4cfd-95cc-574044b7a8cd"
Mar 20 11:00:19 crc kubenswrapper[4860]: I0320 11:00:19.447190 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 20 11:00:19 crc kubenswrapper[4860]: I0320 11:00:19.448423 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Mar 20 11:00:19 crc kubenswrapper[4860]: I0320 11:00:19.448788 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Mar 20 11:00:20 crc kubenswrapper[4860]: I0320 11:00:20.366677 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 11:00:21 crc kubenswrapper[4860]: E0320 11:00:21.441887 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert nginx-conf], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 11:00:21 crc kubenswrapper[4860]: E0320 11:00:21.449318 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-s2dwl], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 11:00:22 crc kubenswrapper[4860]: E0320 11:00:22.449968 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-cqllr], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 11:00:23 crc kubenswrapper[4860]: I0320 11:00:23.439089 4860 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Mar 20 11:00:23 crc kubenswrapper[4860]: I0320 11:00:23.439646 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Mar 20 11:00:27 crc kubenswrapper[4860]: I0320 11:00:27.692339 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 20 11:00:28 crc kubenswrapper[4860]: I0320 11:00:28.547721 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 20 11:00:28 crc kubenswrapper[4860]: I0320 11:00:28.564354 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Mar 20 11:00:28 crc kubenswrapper[4860]: I0320 11:00:28.766909 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Mar 20 11:00:28 crc kubenswrapper[4860]: I0320 11:00:28.859152 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 20 11:00:28 crc kubenswrapper[4860]: I0320 11:00:28.982062 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 20 11:00:29 crc kubenswrapper[4860]: I0320 11:00:29.136419 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 20 11:00:29 crc kubenswrapper[4860]: I0320 11:00:29.148171 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 20 11:00:29 crc kubenswrapper[4860]: I0320 11:00:29.227590 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 20 11:00:29 crc kubenswrapper[4860]: I0320 11:00:29.267518 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Mar 20 11:00:29 crc kubenswrapper[4860]: I0320 11:00:29.280462 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Mar 20 11:00:29 crc kubenswrapper[4860]: I0320 11:00:29.403641 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 20 11:00:29 crc kubenswrapper[4860]: I0320 11:00:29.899057 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Mar 20 11:00:30 crc kubenswrapper[4860]: I0320 11:00:30.173450 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 20 11:00:30 crc kubenswrapper[4860]: I0320 11:00:30.327279 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 20 11:00:30 crc kubenswrapper[4860]: I0320 11:00:30.350393 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Mar 20 11:00:30 crc kubenswrapper[4860]: I0320 11:00:30.428295 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 20 11:00:30 crc kubenswrapper[4860]: I0320 11:00:30.576737 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Mar 20 11:00:30 crc kubenswrapper[4860]: I0320 11:00:30.600917 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Mar 20 11:00:30 crc kubenswrapper[4860]: I0320 11:00:30.646825 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 20 11:00:30 crc kubenswrapper[4860]: I0320 11:00:30.706445 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Mar 20 11:00:30 crc kubenswrapper[4860]: I0320 11:00:30.764261 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Mar 20 11:00:30 crc kubenswrapper[4860]: I0320 11:00:30.964185 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Mar 20 11:00:30 crc kubenswrapper[4860]: I0320 11:00:30.997830 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 20 11:00:31 crc kubenswrapper[4860]: I0320 11:00:31.021800 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 20 11:00:31 crc kubenswrapper[4860]: I0320 11:00:31.043394 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 20 11:00:31 crc kubenswrapper[4860]: I0320 11:00:31.460579 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 20 11:00:31 crc kubenswrapper[4860]: I0320 11:00:31.476701 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Mar 20 11:00:31 crc kubenswrapper[4860]: I0320 11:00:31.521050 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Mar 20 11:00:31 crc kubenswrapper[4860]: I0320 11:00:31.644210 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 20 11:00:31 crc kubenswrapper[4860]: I0320 11:00:31.773263 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 20 11:00:31 crc kubenswrapper[4860]: I0320 11:00:31.822647 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Mar 20 11:00:31 crc kubenswrapper[4860]: I0320 11:00:31.882420 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Mar 20 11:00:31 crc kubenswrapper[4860]: I0320 11:00:31.886429 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Mar 20 11:00:31 crc kubenswrapper[4860]: I0320 11:00:31.934345 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 20 11:00:31 crc kubenswrapper[4860]: I0320 11:00:31.986423 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 20 11:00:32 crc kubenswrapper[4860]: I0320 11:00:32.066206 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 20 11:00:32 crc kubenswrapper[4860]: I0320 11:00:32.148495 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 20 11:00:32 crc kubenswrapper[4860]: I0320 11:00:32.301030 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Mar 20 11:00:32 crc kubenswrapper[4860]: I0320 11:00:32.333403 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 20 11:00:32 crc kubenswrapper[4860]: I0320 11:00:32.358737 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Mar 20 11:00:32 crc kubenswrapper[4860]: I0320 11:00:32.446927 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Mar 20 11:00:32 crc kubenswrapper[4860]: I0320 11:00:32.503947 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Mar 20 11:00:32 crc kubenswrapper[4860]: I0320 11:00:32.519598 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Mar 20 11:00:32 crc kubenswrapper[4860]: I0320 11:00:32.519819 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Mar 20 11:00:32 crc kubenswrapper[4860]: I0320 11:00:32.525336 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Mar 20 11:00:32 crc kubenswrapper[4860]: I0320 11:00:32.526428 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Mar 20 11:00:32 crc kubenswrapper[4860]: I0320 11:00:32.577997 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Mar 20 11:00:32 crc kubenswrapper[4860]: I0320 11:00:32.595770 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 20 11:00:32 crc kubenswrapper[4860]: I0320 11:00:32.597101 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 20 11:00:32 crc kubenswrapper[4860]: I0320 11:00:32.668638 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Mar 20 11:00:32 crc kubenswrapper[4860]: I0320 11:00:32.902112 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 20 11:00:32 crc kubenswrapper[4860]: I0320 11:00:32.904925 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 20 11:00:32 crc kubenswrapper[4860]: I0320 11:00:32.908749 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Mar 20 11:00:32 crc kubenswrapper[4860]: I0320 11:00:32.997162 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 20 11:00:33 crc kubenswrapper[4860]: I0320 11:00:33.074906 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Mar 20 11:00:33 crc kubenswrapper[4860]: I0320 11:00:33.215528 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 20 11:00:33 crc kubenswrapper[4860]: I0320 11:00:33.225170 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 20 11:00:33 crc kubenswrapper[4860]: I0320 11:00:33.277697 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 20 11:00:33 crc kubenswrapper[4860]: I0320 11:00:33.438414 4860 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Mar 20 11:00:33 crc kubenswrapper[4860]: I0320 11:00:33.438478 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Mar 20 11:00:33 crc kubenswrapper[4860]: I0320 11:00:33.438537 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 11:00:33 crc kubenswrapper[4860]: I0320 11:00:33.439119 4860 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"ee792746dcee736a8a92bd7604013dd0c65310e6c461c9c93a8ab79ab26208fd"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted"
Mar 20 11:00:33 crc kubenswrapper[4860]: I0320 11:00:33.439259 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://ee792746dcee736a8a92bd7604013dd0c65310e6c461c9c93a8ab79ab26208fd" gracePeriod=30
Mar 20 11:00:33 crc kubenswrapper[4860]: I0320 11:00:33.460039 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Mar 20 11:00:33 crc kubenswrapper[4860]: I0320 11:00:33.568410 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Mar 20 11:00:33 crc kubenswrapper[4860]: I0320 11:00:33.650336 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 20 11:00:33 crc kubenswrapper[4860]: I0320 11:00:33.658292 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Mar 20 11:00:33 crc kubenswrapper[4860]: I0320 11:00:33.688895 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Mar 20 11:00:33 crc kubenswrapper[4860]: I0320 11:00:33.709400 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 20 11:00:33 crc kubenswrapper[4860]: I0320 11:00:33.711946 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Mar 20 11:00:33 crc kubenswrapper[4860]: I0320 11:00:33.719493 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 20 11:00:33 crc kubenswrapper[4860]: I0320 11:00:33.726069 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Mar 20 11:00:33 crc kubenswrapper[4860]: I0320 11:00:33.751655 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 20 11:00:33 crc kubenswrapper[4860]: I0320 11:00:33.776098 4860 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Mar 20 11:00:33 crc kubenswrapper[4860]: I0320 11:00:33.781719 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Mar 20 11:00:33 crc kubenswrapper[4860]: I0320 11:00:33.790016 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Mar 20 11:00:33 crc kubenswrapper[4860]: I0320 11:00:33.797095 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Mar 20 11:00:33 crc kubenswrapper[4860]: I0320 11:00:33.799063 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 20 11:00:33 crc kubenswrapper[4860]: I0320 11:00:33.851569 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 20 11:00:33 crc kubenswrapper[4860]: I0320 11:00:33.857765 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 20 11:00:33 crc kubenswrapper[4860]: I0320 11:00:33.874870 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 20 11:00:33 crc kubenswrapper[4860]: I0320 11:00:33.912553 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Mar 20 11:00:33 crc kubenswrapper[4860]: I0320 11:00:33.929862 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 20 11:00:33 crc kubenswrapper[4860]: I0320 11:00:33.943074 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 20 11:00:33 crc kubenswrapper[4860]: I0320 11:00:33.959190 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 20 11:00:33 crc kubenswrapper[4860]: I0320 11:00:33.968315 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 20 11:00:33 crc kubenswrapper[4860]: I0320 11:00:33.981700 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Mar 20 11:00:33 crc kubenswrapper[4860]: I0320 11:00:33.991535 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 20 11:00:34 crc kubenswrapper[4860]: I0320 11:00:34.036500 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Mar 20 11:00:34 crc kubenswrapper[4860]: I0320 11:00:34.108886 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Mar 20 11:00:34 crc kubenswrapper[4860]: I0320 11:00:34.401299 4860 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Mar 20 11:00:34 crc kubenswrapper[4860]: I0320 11:00:34.412978 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 11:00:34 crc kubenswrapper[4860]: I0320 11:00:34.418799 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 20 11:00:34 crc kubenswrapper[4860]: I0320 11:00:34.419662 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Mar 20 11:00:34 crc kubenswrapper[4860]: I0320 11:00:34.451612 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Mar 20 11:00:34 crc kubenswrapper[4860]: I0320 11:00:34.549986 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 20 11:00:34 crc kubenswrapper[4860]: I0320 11:00:34.582325 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 20 11:00:34 crc kubenswrapper[4860]: I0320 11:00:34.610665 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Mar 20 11:00:34 crc kubenswrapper[4860]: I0320 11:00:34.614961 4860 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Mar 20 11:00:34 crc kubenswrapper[4860]: I0320 11:00:34.657160 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Mar 20 11:00:34 crc kubenswrapper[4860]: I0320 11:00:34.712983 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Mar 20 11:00:34 crc kubenswrapper[4860]: I0320 11:00:34.726525 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Mar 20 11:00:34 crc kubenswrapper[4860]: I0320 11:00:34.778685 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Mar 20 11:00:34 crc kubenswrapper[4860]: I0320 11:00:34.945570 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 20 11:00:34 crc kubenswrapper[4860]: I0320 11:00:34.947878 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 20 11:00:35 crc kubenswrapper[4860]: I0320 11:00:35.038179 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Mar 20 11:00:35 crc kubenswrapper[4860]: I0320 11:00:35.059774 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Mar 20 11:00:35 crc kubenswrapper[4860]: I0320 11:00:35.102443 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Mar 20 11:00:35 crc kubenswrapper[4860]: I0320 11:00:35.211584 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Mar 20 11:00:35 crc kubenswrapper[4860]: I0320 11:00:35.215704 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Mar 20 11:00:35 crc kubenswrapper[4860]: I0320 11:00:35.263956 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Mar 20 11:00:35 crc kubenswrapper[4860]: I0320 11:00:35.334852 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Mar 20 11:00:35 crc kubenswrapper[4860]: I0320 11:00:35.396269 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Mar 20 11:00:35 crc kubenswrapper[4860]: I0320 11:00:35.412888 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 11:00:35 crc kubenswrapper[4860]: I0320 11:00:35.538526 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 20 11:00:35 crc kubenswrapper[4860]: I0320 11:00:35.563436 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Mar 20 11:00:35 crc kubenswrapper[4860]: I0320 11:00:35.666408 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 20 11:00:35 crc kubenswrapper[4860]: I0320 11:00:35.702012 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Mar 20 11:00:35 crc kubenswrapper[4860]: I0320 11:00:35.831965 4860 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 20 11:00:35 crc kubenswrapper[4860]: I0320 11:00:35.840006 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-marketplace/redhat-marketplace-5jpww","openshift-authentication/oauth-openshift-558db77b4-srz5x"]
Mar 20 11:00:35 crc kubenswrapper[4860]: I0320 11:00:35.840081 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Mar 20 11:00:35 crc kubenswrapper[4860]: I0320 11:00:35.845068 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 11:00:35 crc kubenswrapper[4860]: I0320 11:00:35.848253 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready"
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 11:00:35 crc kubenswrapper[4860]: I0320 11:00:35.865829 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=17.865801612 podStartE2EDuration="17.865801612s" podCreationTimestamp="2026-03-20 11:00:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 11:00:35.861045397 +0000 UTC m=+360.082406305" watchObservedRunningTime="2026-03-20 11:00:35.865801612 +0000 UTC m=+360.087162510" Mar 20 11:00:35 crc kubenswrapper[4860]: I0320 11:00:35.918710 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 20 11:00:35 crc kubenswrapper[4860]: I0320 11:00:35.951483 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 11:00:35 crc kubenswrapper[4860]: I0320 11:00:35.957095 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 11:00:36 crc kubenswrapper[4860]: I0320 11:00:36.019124 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 20 11:00:36 crc kubenswrapper[4860]: I0320 11:00:36.036076 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 20 11:00:36 crc kubenswrapper[4860]: I0320 11:00:36.060056 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 11:00:36 crc kubenswrapper[4860]: I0320 11:00:36.084604 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 20 11:00:36 crc kubenswrapper[4860]: I0320 
11:00:36.142137 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 20 11:00:36 crc kubenswrapper[4860]: I0320 11:00:36.323505 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 20 11:00:36 crc kubenswrapper[4860]: I0320 11:00:36.461069 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 20 11:00:36 crc kubenswrapper[4860]: I0320 11:00:36.476942 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 20 11:00:36 crc kubenswrapper[4860]: I0320 11:00:36.523265 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 20 11:00:36 crc kubenswrapper[4860]: I0320 11:00:36.538519 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 20 11:00:36 crc kubenswrapper[4860]: I0320 11:00:36.615130 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 20 11:00:36 crc kubenswrapper[4860]: I0320 11:00:36.651595 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 20 11:00:36 crc kubenswrapper[4860]: I0320 11:00:36.750757 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 20 11:00:36 crc kubenswrapper[4860]: I0320 11:00:36.755304 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 20 11:00:36 crc kubenswrapper[4860]: I0320 11:00:36.836803 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 20 
11:00:36 crc kubenswrapper[4860]: I0320 11:00:36.853461 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 20 11:00:36 crc kubenswrapper[4860]: I0320 11:00:36.909817 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 20 11:00:36 crc kubenswrapper[4860]: I0320 11:00:36.922507 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 20 11:00:37 crc kubenswrapper[4860]: I0320 11:00:37.039555 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 20 11:00:37 crc kubenswrapper[4860]: I0320 11:00:37.057372 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 20 11:00:37 crc kubenswrapper[4860]: I0320 11:00:37.088527 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 20 11:00:37 crc kubenswrapper[4860]: I0320 11:00:37.096454 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 20 11:00:37 crc kubenswrapper[4860]: I0320 11:00:37.108329 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 20 11:00:37 crc kubenswrapper[4860]: I0320 11:00:37.128001 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 20 11:00:37 crc kubenswrapper[4860]: I0320 11:00:37.164330 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 11:00:37 crc kubenswrapper[4860]: I0320 11:00:37.204724 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 20 
11:00:37 crc kubenswrapper[4860]: I0320 11:00:37.248253 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 20 11:00:37 crc kubenswrapper[4860]: I0320 11:00:37.346679 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 20 11:00:37 crc kubenswrapper[4860]: I0320 11:00:37.391065 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 20 11:00:37 crc kubenswrapper[4860]: I0320 11:00:37.406141 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 20 11:00:37 crc kubenswrapper[4860]: I0320 11:00:37.413367 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 11:00:37 crc kubenswrapper[4860]: I0320 11:00:37.420904 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2268b7ae-c1db-4ef4-8236-60f7cfa277a1" path="/var/lib/kubelet/pods/2268b7ae-c1db-4ef4-8236-60f7cfa277a1/volumes" Mar 20 11:00:37 crc kubenswrapper[4860]: I0320 11:00:37.422258 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3587f3ba-577b-425a-adf5-336a8977dcc5" path="/var/lib/kubelet/pods/3587f3ba-577b-425a-adf5-336a8977dcc5/volumes" Mar 20 11:00:37 crc kubenswrapper[4860]: I0320 11:00:37.590560 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 20 11:00:37 crc kubenswrapper[4860]: I0320 11:00:37.636662 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 20 11:00:37 crc kubenswrapper[4860]: I0320 11:00:37.695071 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 20 11:00:37 crc 
kubenswrapper[4860]: I0320 11:00:37.760142 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 20 11:00:37 crc kubenswrapper[4860]: I0320 11:00:37.840055 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 20 11:00:37 crc kubenswrapper[4860]: I0320 11:00:37.959852 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 20 11:00:37 crc kubenswrapper[4860]: I0320 11:00:37.993571 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 20 11:00:38 crc kubenswrapper[4860]: I0320 11:00:38.076269 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 20 11:00:38 crc kubenswrapper[4860]: I0320 11:00:38.082967 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 11:00:38 crc kubenswrapper[4860]: I0320 11:00:38.096105 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 20 11:00:38 crc kubenswrapper[4860]: I0320 11:00:38.123523 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 20 11:00:38 crc kubenswrapper[4860]: I0320 11:00:38.152634 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 20 11:00:38 crc kubenswrapper[4860]: I0320 11:00:38.232272 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 20 11:00:38 crc kubenswrapper[4860]: I0320 11:00:38.288402 4860 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 20 11:00:38 crc kubenswrapper[4860]: I0320 11:00:38.300917 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 20 11:00:38 crc kubenswrapper[4860]: I0320 11:00:38.348862 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 20 11:00:38 crc kubenswrapper[4860]: I0320 11:00:38.618805 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 20 11:00:38 crc kubenswrapper[4860]: I0320 11:00:38.700347 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 20 11:00:38 crc kubenswrapper[4860]: I0320 11:00:38.720608 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 20 11:00:38 crc kubenswrapper[4860]: I0320 11:00:38.739110 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 20 11:00:38 crc kubenswrapper[4860]: I0320 11:00:38.741948 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 20 11:00:38 crc kubenswrapper[4860]: I0320 11:00:38.887761 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 20 11:00:38 crc kubenswrapper[4860]: I0320 11:00:38.924981 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 20 11:00:38 crc kubenswrapper[4860]: I0320 11:00:38.937780 4860 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 20 11:00:39 crc kubenswrapper[4860]: I0320 11:00:39.011488 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 20 11:00:39 crc kubenswrapper[4860]: I0320 11:00:39.102656 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 20 11:00:39 crc kubenswrapper[4860]: I0320 11:00:39.188366 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 20 11:00:39 crc kubenswrapper[4860]: I0320 11:00:39.280723 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 20 11:00:39 crc kubenswrapper[4860]: I0320 11:00:39.288137 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 20 11:00:39 crc kubenswrapper[4860]: I0320 11:00:39.337775 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 20 11:00:39 crc kubenswrapper[4860]: I0320 11:00:39.373088 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 20 11:00:39 crc kubenswrapper[4860]: I0320 11:00:39.381343 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 20 11:00:39 crc kubenswrapper[4860]: I0320 11:00:39.467455 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 20 11:00:39 crc kubenswrapper[4860]: I0320 11:00:39.494454 4860 reflector.go:368] Caches populated for *v1.CSIDriver from 
k8s.io/client-go/informers/factory.go:160 Mar 20 11:00:39 crc kubenswrapper[4860]: I0320 11:00:39.531481 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 20 11:00:39 crc kubenswrapper[4860]: I0320 11:00:39.549363 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 20 11:00:39 crc kubenswrapper[4860]: I0320 11:00:39.571598 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 20 11:00:39 crc kubenswrapper[4860]: I0320 11:00:39.608334 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 20 11:00:39 crc kubenswrapper[4860]: I0320 11:00:39.665521 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 20 11:00:39 crc kubenswrapper[4860]: I0320 11:00:39.766355 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 20 11:00:39 crc kubenswrapper[4860]: I0320 11:00:39.771008 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 20 11:00:39 crc kubenswrapper[4860]: I0320 11:00:39.783899 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 20 11:00:39 crc kubenswrapper[4860]: I0320 11:00:39.899439 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 20 11:00:39 crc kubenswrapper[4860]: I0320 11:00:39.905167 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 20 11:00:40 crc kubenswrapper[4860]: I0320 11:00:40.095474 4860 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 20 11:00:40 crc kubenswrapper[4860]: I0320 11:00:40.103858 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 20 11:00:40 crc kubenswrapper[4860]: I0320 11:00:40.138952 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 20 11:00:40 crc kubenswrapper[4860]: I0320 11:00:40.145907 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 20 11:00:40 crc kubenswrapper[4860]: I0320 11:00:40.164770 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 20 11:00:40 crc kubenswrapper[4860]: I0320 11:00:40.166770 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 20 11:00:40 crc kubenswrapper[4860]: I0320 11:00:40.196600 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 20 11:00:40 crc kubenswrapper[4860]: I0320 11:00:40.442511 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 11:00:40 crc kubenswrapper[4860]: I0320 11:00:40.733547 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 20 11:00:40 crc kubenswrapper[4860]: I0320 11:00:40.784679 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 20 11:00:40 crc kubenswrapper[4860]: I0320 11:00:40.818086 4860 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 11:00:40 crc kubenswrapper[4860]: I0320 11:00:40.818413 
4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://6f9fcf8849363a86240c3e522164d76994e316f4d215b2b524d3e54d9f3d5cbb" gracePeriod=5 Mar 20 11:00:40 crc kubenswrapper[4860]: I0320 11:00:40.826894 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 20 11:00:40 crc kubenswrapper[4860]: I0320 11:00:40.980800 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 20 11:00:41 crc kubenswrapper[4860]: I0320 11:00:41.025495 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 20 11:00:41 crc kubenswrapper[4860]: I0320 11:00:41.054215 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 20 11:00:41 crc kubenswrapper[4860]: I0320 11:00:41.090160 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 20 11:00:41 crc kubenswrapper[4860]: I0320 11:00:41.128773 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 20 11:00:41 crc kubenswrapper[4860]: I0320 11:00:41.221098 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 20 11:00:41 crc kubenswrapper[4860]: I0320 11:00:41.300024 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 20 11:00:41 crc kubenswrapper[4860]: I0320 11:00:41.355940 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 20 11:00:41 crc 
kubenswrapper[4860]: I0320 11:00:41.376511 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 20 11:00:41 crc kubenswrapper[4860]: I0320 11:00:41.409434 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 20 11:00:41 crc kubenswrapper[4860]: I0320 11:00:41.537116 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 20 11:00:41 crc kubenswrapper[4860]: I0320 11:00:41.665879 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 20 11:00:41 crc kubenswrapper[4860]: I0320 11:00:41.672419 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 20 11:00:41 crc kubenswrapper[4860]: I0320 11:00:41.721019 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 20 11:00:41 crc kubenswrapper[4860]: I0320 11:00:41.769375 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 20 11:00:41 crc kubenswrapper[4860]: I0320 11:00:41.934454 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 20 11:00:42 crc kubenswrapper[4860]: I0320 11:00:42.167253 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 20 11:00:42 crc kubenswrapper[4860]: I0320 11:00:42.353958 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 20 11:00:42 crc kubenswrapper[4860]: I0320 11:00:42.485866 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 
20 11:00:42 crc kubenswrapper[4860]: I0320 11:00:42.603150 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 20 11:00:42 crc kubenswrapper[4860]: I0320 11:00:42.720253 4860 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 20 11:00:42 crc kubenswrapper[4860]: I0320 11:00:42.767153 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 20 11:00:42 crc kubenswrapper[4860]: I0320 11:00:42.816080 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 20 11:00:42 crc kubenswrapper[4860]: I0320 11:00:42.857706 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 20 11:00:42 crc kubenswrapper[4860]: I0320 11:00:42.926550 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 20 11:00:42 crc kubenswrapper[4860]: I0320 11:00:42.960760 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 20 11:00:43 crc kubenswrapper[4860]: I0320 11:00:43.014793 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 20 11:00:43 crc kubenswrapper[4860]: I0320 11:00:43.132025 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 20 11:00:43 crc kubenswrapper[4860]: I0320 11:00:43.233938 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 20 11:00:43 crc kubenswrapper[4860]: I0320 11:00:43.665287 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 11:00:44 crc 
kubenswrapper[4860]: I0320 11:00:44.493654 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts"] Mar 20 11:00:44 crc kubenswrapper[4860]: E0320 11:00:44.493898 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.493912 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 20 11:00:44 crc kubenswrapper[4860]: E0320 11:00:44.493923 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2268b7ae-c1db-4ef4-8236-60f7cfa277a1" containerName="extract-utilities" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.493930 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="2268b7ae-c1db-4ef4-8236-60f7cfa277a1" containerName="extract-utilities" Mar 20 11:00:44 crc kubenswrapper[4860]: E0320 11:00:44.493949 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3e81cb0-739d-41be-b9fb-53a1c3de6a3d" containerName="installer" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.493956 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3e81cb0-739d-41be-b9fb-53a1c3de6a3d" containerName="installer" Mar 20 11:00:44 crc kubenswrapper[4860]: E0320 11:00:44.493965 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2268b7ae-c1db-4ef4-8236-60f7cfa277a1" containerName="extract-content" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.493970 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="2268b7ae-c1db-4ef4-8236-60f7cfa277a1" containerName="extract-content" Mar 20 11:00:44 crc kubenswrapper[4860]: E0320 11:00:44.493979 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2268b7ae-c1db-4ef4-8236-60f7cfa277a1" containerName="registry-server" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.493985 
4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="2268b7ae-c1db-4ef4-8236-60f7cfa277a1" containerName="registry-server" Mar 20 11:00:44 crc kubenswrapper[4860]: E0320 11:00:44.493993 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3587f3ba-577b-425a-adf5-336a8977dcc5" containerName="oauth-openshift" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.494000 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="3587f3ba-577b-425a-adf5-336a8977dcc5" containerName="oauth-openshift" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.494088 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="3587f3ba-577b-425a-adf5-336a8977dcc5" containerName="oauth-openshift" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.494099 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="2268b7ae-c1db-4ef4-8236-60f7cfa277a1" containerName="registry-server" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.494112 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3e81cb0-739d-41be-b9fb-53a1c3de6a3d" containerName="installer" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.494118 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.494536 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.496719 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.496941 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.497770 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.498646 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.498829 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.498986 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.499130 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.503211 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.505309 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.511171 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts"] Mar 
20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.511347 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.511539 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.511659 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.514277 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0f06028e-1b3c-4890-857c-4f45971b09e2-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.514348 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0f06028e-1b3c-4890-857c-4f45971b09e2-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.514377 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0f06028e-1b3c-4890-857c-4f45971b09e2-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc 
kubenswrapper[4860]: I0320 11:00:44.514404 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5bgm\" (UniqueName: \"kubernetes.io/projected/0f06028e-1b3c-4890-857c-4f45971b09e2-kube-api-access-k5bgm\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.514433 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0f06028e-1b3c-4890-857c-4f45971b09e2-v4-0-config-system-router-certs\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.514464 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0f06028e-1b3c-4890-857c-4f45971b09e2-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.514490 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0f06028e-1b3c-4890-857c-4f45971b09e2-audit-dir\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.514528 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/0f06028e-1b3c-4890-857c-4f45971b09e2-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.514568 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0f06028e-1b3c-4890-857c-4f45971b09e2-v4-0-config-user-template-login\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.514616 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0f06028e-1b3c-4890-857c-4f45971b09e2-audit-policies\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.514646 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0f06028e-1b3c-4890-857c-4f45971b09e2-v4-0-config-user-template-error\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.514725 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0f06028e-1b3c-4890-857c-4f45971b09e2-v4-0-config-system-service-ca\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: 
\"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.514763 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0f06028e-1b3c-4890-857c-4f45971b09e2-v4-0-config-system-session\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.514796 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0f06028e-1b3c-4890-857c-4f45971b09e2-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.517651 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.518838 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.533147 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.616304 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0f06028e-1b3c-4890-857c-4f45971b09e2-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " 
pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.616367 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0f06028e-1b3c-4890-857c-4f45971b09e2-audit-dir\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.616412 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0f06028e-1b3c-4890-857c-4f45971b09e2-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.616439 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0f06028e-1b3c-4890-857c-4f45971b09e2-v4-0-config-user-template-login\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.616535 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0f06028e-1b3c-4890-857c-4f45971b09e2-audit-dir\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.617730 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/0f06028e-1b3c-4890-857c-4f45971b09e2-audit-policies\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.617754 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0f06028e-1b3c-4890-857c-4f45971b09e2-audit-policies\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.617942 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0f06028e-1b3c-4890-857c-4f45971b09e2-v4-0-config-user-template-error\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.618057 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0f06028e-1b3c-4890-857c-4f45971b09e2-v4-0-config-system-service-ca\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.618096 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0f06028e-1b3c-4890-857c-4f45971b09e2-v4-0-config-system-session\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc 
kubenswrapper[4860]: I0320 11:00:44.618123 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0f06028e-1b3c-4890-857c-4f45971b09e2-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.618181 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0f06028e-1b3c-4890-857c-4f45971b09e2-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.632476 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0f06028e-1b3c-4890-857c-4f45971b09e2-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.632542 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0f06028e-1b3c-4890-857c-4f45971b09e2-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.632589 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5bgm\" (UniqueName: 
\"kubernetes.io/projected/0f06028e-1b3c-4890-857c-4f45971b09e2-kube-api-access-k5bgm\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.632652 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0f06028e-1b3c-4890-857c-4f45971b09e2-v4-0-config-system-router-certs\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.633298 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0f06028e-1b3c-4890-857c-4f45971b09e2-v4-0-config-user-template-login\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.635966 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0f06028e-1b3c-4890-857c-4f45971b09e2-v4-0-config-user-template-error\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.636260 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0f06028e-1b3c-4890-857c-4f45971b09e2-v4-0-config-system-session\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 
crc kubenswrapper[4860]: I0320 11:00:44.636707 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0f06028e-1b3c-4890-857c-4f45971b09e2-v4-0-config-system-service-ca\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.636749 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0f06028e-1b3c-4890-857c-4f45971b09e2-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.637899 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0f06028e-1b3c-4890-857c-4f45971b09e2-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.641445 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0f06028e-1b3c-4890-857c-4f45971b09e2-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.641545 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/0f06028e-1b3c-4890-857c-4f45971b09e2-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.641974 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0f06028e-1b3c-4890-857c-4f45971b09e2-v4-0-config-system-router-certs\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.642902 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0f06028e-1b3c-4890-857c-4f45971b09e2-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.643104 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0f06028e-1b3c-4890-857c-4f45971b09e2-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.659834 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5bgm\" (UniqueName: \"kubernetes.io/projected/0f06028e-1b3c-4890-857c-4f45971b09e2-kube-api-access-k5bgm\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc 
kubenswrapper[4860]: I0320 11:00:44.823184 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.932238 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 11:00:45 crc kubenswrapper[4860]: I0320 11:00:45.153261 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 20 11:00:45 crc kubenswrapper[4860]: I0320 11:00:45.228619 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts"] Mar 20 11:00:45 crc kubenswrapper[4860]: I0320 11:00:45.499663 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" event={"ID":"0f06028e-1b3c-4890-857c-4f45971b09e2","Type":"ContainerStarted","Data":"c4b23f985c72e603416bf3ef53615b0f51cd9bb142bf92c48b86f3e167cc39e6"} Mar 20 11:00:45 crc kubenswrapper[4860]: I0320 11:00:45.500957 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:45 crc kubenswrapper[4860]: I0320 11:00:45.500979 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" event={"ID":"0f06028e-1b3c-4890-857c-4f45971b09e2","Type":"ContainerStarted","Data":"1f2bbffe8baa0b74a42d4a12f23036f4ae087c8ae5f3f27235e6d19fe322ae6b"} Mar 20 11:00:45 crc kubenswrapper[4860]: I0320 11:00:45.502487 4860 patch_prober.go:28] interesting pod/oauth-openshift-6f8f59f8d9-5xxts container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.67:6443/healthz\": dial tcp 10.217.0.67:6443: connect: connection refused" start-of-body= Mar 20 11:00:45 crc kubenswrapper[4860]: I0320 11:00:45.502534 
4860 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" podUID="0f06028e-1b3c-4890-857c-4f45971b09e2" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.67:6443/healthz\": dial tcp 10.217.0.67:6443: connect: connection refused" Mar 20 11:00:45 crc kubenswrapper[4860]: I0320 11:00:45.517309 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" podStartSLOduration=74.517284842 podStartE2EDuration="1m14.517284842s" podCreationTimestamp="2026-03-20 10:59:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 11:00:45.515746738 +0000 UTC m=+369.737107636" watchObservedRunningTime="2026-03-20 11:00:45.517284842 +0000 UTC m=+369.738645740" Mar 20 11:00:46 crc kubenswrapper[4860]: I0320 11:00:46.398781 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 20 11:00:46 crc kubenswrapper[4860]: I0320 11:00:46.398904 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 11:00:46 crc kubenswrapper[4860]: I0320 11:00:46.460539 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 11:00:46 crc kubenswrapper[4860]: I0320 11:00:46.460621 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 11:00:46 crc kubenswrapper[4860]: I0320 11:00:46.460793 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 11:00:46 crc kubenswrapper[4860]: I0320 11:00:46.460839 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 11:00:46 crc kubenswrapper[4860]: I0320 11:00:46.460925 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 11:00:46 crc kubenswrapper[4860]: I0320 11:00:46.460918 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: 
"resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:00:46 crc kubenswrapper[4860]: I0320 11:00:46.460825 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:00:46 crc kubenswrapper[4860]: I0320 11:00:46.461046 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:00:46 crc kubenswrapper[4860]: I0320 11:00:46.461142 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:00:46 crc kubenswrapper[4860]: I0320 11:00:46.461687 4860 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:46 crc kubenswrapper[4860]: I0320 11:00:46.461713 4860 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:46 crc kubenswrapper[4860]: I0320 11:00:46.461732 4860 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:46 crc kubenswrapper[4860]: I0320 11:00:46.461748 4860 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:46 crc kubenswrapper[4860]: I0320 11:00:46.472319 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:00:46 crc kubenswrapper[4860]: I0320 11:00:46.509111 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 20 11:00:46 crc kubenswrapper[4860]: I0320 11:00:46.510461 4860 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="6f9fcf8849363a86240c3e522164d76994e316f4d215b2b524d3e54d9f3d5cbb" exitCode=137 Mar 20 11:00:46 crc kubenswrapper[4860]: I0320 11:00:46.510538 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 11:00:46 crc kubenswrapper[4860]: I0320 11:00:46.510567 4860 scope.go:117] "RemoveContainer" containerID="6f9fcf8849363a86240c3e522164d76994e316f4d215b2b524d3e54d9f3d5cbb" Mar 20 11:00:46 crc kubenswrapper[4860]: I0320 11:00:46.517440 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:46 crc kubenswrapper[4860]: I0320 11:00:46.540378 4860 scope.go:117] "RemoveContainer" containerID="6f9fcf8849363a86240c3e522164d76994e316f4d215b2b524d3e54d9f3d5cbb" Mar 20 11:00:46 crc kubenswrapper[4860]: E0320 11:00:46.541635 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f9fcf8849363a86240c3e522164d76994e316f4d215b2b524d3e54d9f3d5cbb\": container with ID starting with 6f9fcf8849363a86240c3e522164d76994e316f4d215b2b524d3e54d9f3d5cbb not found: ID does not exist" containerID="6f9fcf8849363a86240c3e522164d76994e316f4d215b2b524d3e54d9f3d5cbb" Mar 20 11:00:46 crc kubenswrapper[4860]: I0320 11:00:46.541677 4860 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6f9fcf8849363a86240c3e522164d76994e316f4d215b2b524d3e54d9f3d5cbb"} err="failed to get container status \"6f9fcf8849363a86240c3e522164d76994e316f4d215b2b524d3e54d9f3d5cbb\": rpc error: code = NotFound desc = could not find container \"6f9fcf8849363a86240c3e522164d76994e316f4d215b2b524d3e54d9f3d5cbb\": container with ID starting with 6f9fcf8849363a86240c3e522164d76994e316f4d215b2b524d3e54d9f3d5cbb not found: ID does not exist" Mar 20 11:00:46 crc kubenswrapper[4860]: I0320 11:00:46.564049 4860 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:47 crc kubenswrapper[4860]: I0320 11:00:47.421858 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 20 11:01:01 crc kubenswrapper[4860]: I0320 11:01:01.608601 4860 generic.go:334] "Generic (PLEG): container finished" podID="403ca5f6-bd52-40de-88d6-5151b3202c76" containerID="857625adfd41b4c07f7a8c47d596e9428a9381932a0175f6ddfcb8f7de0229a9" exitCode=0 Mar 20 11:01:01 crc kubenswrapper[4860]: I0320 11:01:01.608706 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zhgh4" event={"ID":"403ca5f6-bd52-40de-88d6-5151b3202c76","Type":"ContainerDied","Data":"857625adfd41b4c07f7a8c47d596e9428a9381932a0175f6ddfcb8f7de0229a9"} Mar 20 11:01:01 crc kubenswrapper[4860]: I0320 11:01:01.609822 4860 scope.go:117] "RemoveContainer" containerID="857625adfd41b4c07f7a8c47d596e9428a9381932a0175f6ddfcb8f7de0229a9" Mar 20 11:01:02 crc kubenswrapper[4860]: I0320 11:01:02.615947 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zhgh4" 
event={"ID":"403ca5f6-bd52-40de-88d6-5151b3202c76","Type":"ContainerStarted","Data":"71b2cb77686050f112cbf04ff04d1685bf9b88c82f8a98a4da293848bc821143"} Mar 20 11:01:02 crc kubenswrapper[4860]: I0320 11:01:02.616754 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-zhgh4" Mar 20 11:01:02 crc kubenswrapper[4860]: I0320 11:01:02.618559 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-zhgh4" Mar 20 11:01:03 crc kubenswrapper[4860]: I0320 11:01:03.623762 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Mar 20 11:01:03 crc kubenswrapper[4860]: I0320 11:01:03.624661 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 20 11:01:03 crc kubenswrapper[4860]: I0320 11:01:03.626201 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 20 11:01:03 crc kubenswrapper[4860]: I0320 11:01:03.626270 4860 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="ee792746dcee736a8a92bd7604013dd0c65310e6c461c9c93a8ab79ab26208fd" exitCode=137 Mar 20 11:01:03 crc kubenswrapper[4860]: I0320 11:01:03.626394 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"ee792746dcee736a8a92bd7604013dd0c65310e6c461c9c93a8ab79ab26208fd"} Mar 20 11:01:03 crc kubenswrapper[4860]: I0320 11:01:03.626499 4860 scope.go:117] "RemoveContainer" 
containerID="6bf4a38879e8e3c687bd3c57a3c68a29ad9a9e609ea0cbd220493b6ee4e7d9a3" Mar 20 11:01:04 crc kubenswrapper[4860]: I0320 11:01:04.635866 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Mar 20 11:01:04 crc kubenswrapper[4860]: I0320 11:01:04.637265 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 20 11:01:04 crc kubenswrapper[4860]: I0320 11:01:04.638434 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c10a79b01ed3c1a1c40f224371e5b1990c85d54c1c4aec9018ecd4245f9d69fc"} Mar 20 11:01:10 crc kubenswrapper[4860]: I0320 11:01:10.366665 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 11:01:13 crc kubenswrapper[4860]: I0320 11:01:13.438007 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 11:01:13 crc kubenswrapper[4860]: I0320 11:01:13.444488 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 11:01:20 crc kubenswrapper[4860]: I0320 11:01:20.370515 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 11:01:21 crc kubenswrapper[4860]: I0320 11:01:21.178350 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566740-627pq"] Mar 20 11:01:21 crc kubenswrapper[4860]: I0320 11:01:21.179748 4860 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-627pq" Mar 20 11:01:21 crc kubenswrapper[4860]: I0320 11:01:21.182493 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 11:01:21 crc kubenswrapper[4860]: I0320 11:01:21.182751 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 11:01:21 crc kubenswrapper[4860]: I0320 11:01:21.193771 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566740-26bw9"] Mar 20 11:01:21 crc kubenswrapper[4860]: I0320 11:01:21.215698 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566740-26bw9" Mar 20 11:01:21 crc kubenswrapper[4860]: I0320 11:01:21.234788 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:01:21 crc kubenswrapper[4860]: I0320 11:01:21.237087 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-82p2f" Mar 20 11:01:21 crc kubenswrapper[4860]: I0320 11:01:21.241708 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:01:21 crc kubenswrapper[4860]: I0320 11:01:21.244545 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566740-627pq"] Mar 20 11:01:21 crc kubenswrapper[4860]: I0320 11:01:21.255475 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566740-26bw9"] Mar 20 11:01:21 crc kubenswrapper[4860]: I0320 11:01:21.309764 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sv5sr\" 
(UniqueName: \"kubernetes.io/projected/41a09ead-8137-4791-896c-c5a9cad7f4cf-kube-api-access-sv5sr\") pod \"collect-profiles-29566740-627pq\" (UID: \"41a09ead-8137-4791-896c-c5a9cad7f4cf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-627pq" Mar 20 11:01:21 crc kubenswrapper[4860]: I0320 11:01:21.309860 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/41a09ead-8137-4791-896c-c5a9cad7f4cf-secret-volume\") pod \"collect-profiles-29566740-627pq\" (UID: \"41a09ead-8137-4791-896c-c5a9cad7f4cf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-627pq" Mar 20 11:01:21 crc kubenswrapper[4860]: I0320 11:01:21.309919 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/41a09ead-8137-4791-896c-c5a9cad7f4cf-config-volume\") pod \"collect-profiles-29566740-627pq\" (UID: \"41a09ead-8137-4791-896c-c5a9cad7f4cf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-627pq" Mar 20 11:01:21 crc kubenswrapper[4860]: I0320 11:01:21.309947 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlzmr\" (UniqueName: \"kubernetes.io/projected/b31d1240-ea69-4da9-9a40-70f252222d4d-kube-api-access-mlzmr\") pod \"auto-csr-approver-29566740-26bw9\" (UID: \"b31d1240-ea69-4da9-9a40-70f252222d4d\") " pod="openshift-infra/auto-csr-approver-29566740-26bw9" Mar 20 11:01:21 crc kubenswrapper[4860]: I0320 11:01:21.411083 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sv5sr\" (UniqueName: \"kubernetes.io/projected/41a09ead-8137-4791-896c-c5a9cad7f4cf-kube-api-access-sv5sr\") pod \"collect-profiles-29566740-627pq\" (UID: \"41a09ead-8137-4791-896c-c5a9cad7f4cf\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-627pq" Mar 20 11:01:21 crc kubenswrapper[4860]: I0320 11:01:21.411144 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/41a09ead-8137-4791-896c-c5a9cad7f4cf-secret-volume\") pod \"collect-profiles-29566740-627pq\" (UID: \"41a09ead-8137-4791-896c-c5a9cad7f4cf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-627pq" Mar 20 11:01:21 crc kubenswrapper[4860]: I0320 11:01:21.411173 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/41a09ead-8137-4791-896c-c5a9cad7f4cf-config-volume\") pod \"collect-profiles-29566740-627pq\" (UID: \"41a09ead-8137-4791-896c-c5a9cad7f4cf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-627pq" Mar 20 11:01:21 crc kubenswrapper[4860]: I0320 11:01:21.411207 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlzmr\" (UniqueName: \"kubernetes.io/projected/b31d1240-ea69-4da9-9a40-70f252222d4d-kube-api-access-mlzmr\") pod \"auto-csr-approver-29566740-26bw9\" (UID: \"b31d1240-ea69-4da9-9a40-70f252222d4d\") " pod="openshift-infra/auto-csr-approver-29566740-26bw9" Mar 20 11:01:21 crc kubenswrapper[4860]: I0320 11:01:21.412578 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/41a09ead-8137-4791-896c-c5a9cad7f4cf-config-volume\") pod \"collect-profiles-29566740-627pq\" (UID: \"41a09ead-8137-4791-896c-c5a9cad7f4cf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-627pq" Mar 20 11:01:21 crc kubenswrapper[4860]: I0320 11:01:21.422931 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/41a09ead-8137-4791-896c-c5a9cad7f4cf-secret-volume\") pod 
\"collect-profiles-29566740-627pq\" (UID: \"41a09ead-8137-4791-896c-c5a9cad7f4cf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-627pq" Mar 20 11:01:21 crc kubenswrapper[4860]: I0320 11:01:21.434986 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sv5sr\" (UniqueName: \"kubernetes.io/projected/41a09ead-8137-4791-896c-c5a9cad7f4cf-kube-api-access-sv5sr\") pod \"collect-profiles-29566740-627pq\" (UID: \"41a09ead-8137-4791-896c-c5a9cad7f4cf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-627pq" Mar 20 11:01:21 crc kubenswrapper[4860]: I0320 11:01:21.447573 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlzmr\" (UniqueName: \"kubernetes.io/projected/b31d1240-ea69-4da9-9a40-70f252222d4d-kube-api-access-mlzmr\") pod \"auto-csr-approver-29566740-26bw9\" (UID: \"b31d1240-ea69-4da9-9a40-70f252222d4d\") " pod="openshift-infra/auto-csr-approver-29566740-26bw9" Mar 20 11:01:21 crc kubenswrapper[4860]: I0320 11:01:21.544043 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-627pq" Mar 20 11:01:21 crc kubenswrapper[4860]: I0320 11:01:21.560510 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566740-26bw9" Mar 20 11:01:21 crc kubenswrapper[4860]: I0320 11:01:21.966652 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566740-26bw9"] Mar 20 11:01:22 crc kubenswrapper[4860]: I0320 11:01:22.012606 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566740-627pq"] Mar 20 11:01:22 crc kubenswrapper[4860]: W0320 11:01:22.017154 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41a09ead_8137_4791_896c_c5a9cad7f4cf.slice/crio-95cd01e08e4aa9d5bc8aab08fcfec2f0373c59ae07e50c96cd2b03534805f951 WatchSource:0}: Error finding container 95cd01e08e4aa9d5bc8aab08fcfec2f0373c59ae07e50c96cd2b03534805f951: Status 404 returned error can't find the container with id 95cd01e08e4aa9d5bc8aab08fcfec2f0373c59ae07e50c96cd2b03534805f951 Mar 20 11:01:22 crc kubenswrapper[4860]: I0320 11:01:22.768096 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566740-26bw9" event={"ID":"b31d1240-ea69-4da9-9a40-70f252222d4d","Type":"ContainerStarted","Data":"5a39a5b27057452598a534207e217f2291067df2b9f67d84a1fc26e76a94fa5a"} Mar 20 11:01:22 crc kubenswrapper[4860]: I0320 11:01:22.771900 4860 generic.go:334] "Generic (PLEG): container finished" podID="41a09ead-8137-4791-896c-c5a9cad7f4cf" containerID="728de8ccc22f402da25ca09407c17b66749c7ba40a4b7eb4c5cb707fe2325a9c" exitCode=0 Mar 20 11:01:22 crc kubenswrapper[4860]: I0320 11:01:22.771959 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-627pq" event={"ID":"41a09ead-8137-4791-896c-c5a9cad7f4cf","Type":"ContainerDied","Data":"728de8ccc22f402da25ca09407c17b66749c7ba40a4b7eb4c5cb707fe2325a9c"} Mar 20 11:01:22 crc kubenswrapper[4860]: I0320 11:01:22.772000 4860 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-627pq" event={"ID":"41a09ead-8137-4791-896c-c5a9cad7f4cf","Type":"ContainerStarted","Data":"95cd01e08e4aa9d5bc8aab08fcfec2f0373c59ae07e50c96cd2b03534805f951"} Mar 20 11:01:23 crc kubenswrapper[4860]: I0320 11:01:23.778729 4860 generic.go:334] "Generic (PLEG): container finished" podID="b31d1240-ea69-4da9-9a40-70f252222d4d" containerID="e24c70c83330a3f76a9f16d28ca14d8e62ae9184fa806a34c1edf7e65a681362" exitCode=0 Mar 20 11:01:23 crc kubenswrapper[4860]: I0320 11:01:23.778785 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566740-26bw9" event={"ID":"b31d1240-ea69-4da9-9a40-70f252222d4d","Type":"ContainerDied","Data":"e24c70c83330a3f76a9f16d28ca14d8e62ae9184fa806a34c1edf7e65a681362"} Mar 20 11:01:24 crc kubenswrapper[4860]: I0320 11:01:24.039510 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-627pq" Mar 20 11:01:24 crc kubenswrapper[4860]: I0320 11:01:24.153350 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/41a09ead-8137-4791-896c-c5a9cad7f4cf-config-volume\") pod \"41a09ead-8137-4791-896c-c5a9cad7f4cf\" (UID: \"41a09ead-8137-4791-896c-c5a9cad7f4cf\") " Mar 20 11:01:24 crc kubenswrapper[4860]: I0320 11:01:24.153392 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/41a09ead-8137-4791-896c-c5a9cad7f4cf-secret-volume\") pod \"41a09ead-8137-4791-896c-c5a9cad7f4cf\" (UID: \"41a09ead-8137-4791-896c-c5a9cad7f4cf\") " Mar 20 11:01:24 crc kubenswrapper[4860]: I0320 11:01:24.153512 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sv5sr\" (UniqueName: 
\"kubernetes.io/projected/41a09ead-8137-4791-896c-c5a9cad7f4cf-kube-api-access-sv5sr\") pod \"41a09ead-8137-4791-896c-c5a9cad7f4cf\" (UID: \"41a09ead-8137-4791-896c-c5a9cad7f4cf\") " Mar 20 11:01:24 crc kubenswrapper[4860]: I0320 11:01:24.154820 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41a09ead-8137-4791-896c-c5a9cad7f4cf-config-volume" (OuterVolumeSpecName: "config-volume") pod "41a09ead-8137-4791-896c-c5a9cad7f4cf" (UID: "41a09ead-8137-4791-896c-c5a9cad7f4cf"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:01:24 crc kubenswrapper[4860]: I0320 11:01:24.163690 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41a09ead-8137-4791-896c-c5a9cad7f4cf-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "41a09ead-8137-4791-896c-c5a9cad7f4cf" (UID: "41a09ead-8137-4791-896c-c5a9cad7f4cf"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:01:24 crc kubenswrapper[4860]: I0320 11:01:24.167066 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41a09ead-8137-4791-896c-c5a9cad7f4cf-kube-api-access-sv5sr" (OuterVolumeSpecName: "kube-api-access-sv5sr") pod "41a09ead-8137-4791-896c-c5a9cad7f4cf" (UID: "41a09ead-8137-4791-896c-c5a9cad7f4cf"). InnerVolumeSpecName "kube-api-access-sv5sr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:01:24 crc kubenswrapper[4860]: I0320 11:01:24.254603 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sv5sr\" (UniqueName: \"kubernetes.io/projected/41a09ead-8137-4791-896c-c5a9cad7f4cf-kube-api-access-sv5sr\") on node \"crc\" DevicePath \"\"" Mar 20 11:01:24 crc kubenswrapper[4860]: I0320 11:01:24.254642 4860 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/41a09ead-8137-4791-896c-c5a9cad7f4cf-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 11:01:24 crc kubenswrapper[4860]: I0320 11:01:24.254652 4860 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/41a09ead-8137-4791-896c-c5a9cad7f4cf-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 11:01:24 crc kubenswrapper[4860]: I0320 11:01:24.789069 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-627pq" Mar 20 11:01:24 crc kubenswrapper[4860]: I0320 11:01:24.789273 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-627pq" event={"ID":"41a09ead-8137-4791-896c-c5a9cad7f4cf","Type":"ContainerDied","Data":"95cd01e08e4aa9d5bc8aab08fcfec2f0373c59ae07e50c96cd2b03534805f951"} Mar 20 11:01:24 crc kubenswrapper[4860]: I0320 11:01:24.793869 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="95cd01e08e4aa9d5bc8aab08fcfec2f0373c59ae07e50c96cd2b03534805f951" Mar 20 11:01:25 crc kubenswrapper[4860]: I0320 11:01:25.032896 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566740-26bw9" Mar 20 11:01:25 crc kubenswrapper[4860]: I0320 11:01:25.177127 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlzmr\" (UniqueName: \"kubernetes.io/projected/b31d1240-ea69-4da9-9a40-70f252222d4d-kube-api-access-mlzmr\") pod \"b31d1240-ea69-4da9-9a40-70f252222d4d\" (UID: \"b31d1240-ea69-4da9-9a40-70f252222d4d\") " Mar 20 11:01:25 crc kubenswrapper[4860]: I0320 11:01:25.183427 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b31d1240-ea69-4da9-9a40-70f252222d4d-kube-api-access-mlzmr" (OuterVolumeSpecName: "kube-api-access-mlzmr") pod "b31d1240-ea69-4da9-9a40-70f252222d4d" (UID: "b31d1240-ea69-4da9-9a40-70f252222d4d"). InnerVolumeSpecName "kube-api-access-mlzmr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:01:25 crc kubenswrapper[4860]: I0320 11:01:25.279579 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mlzmr\" (UniqueName: \"kubernetes.io/projected/b31d1240-ea69-4da9-9a40-70f252222d4d-kube-api-access-mlzmr\") on node \"crc\" DevicePath \"\"" Mar 20 11:01:25 crc kubenswrapper[4860]: I0320 11:01:25.796470 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566740-26bw9" Mar 20 11:01:25 crc kubenswrapper[4860]: I0320 11:01:25.796504 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566740-26bw9" event={"ID":"b31d1240-ea69-4da9-9a40-70f252222d4d","Type":"ContainerDied","Data":"5a39a5b27057452598a534207e217f2291067df2b9f67d84a1fc26e76a94fa5a"} Mar 20 11:01:25 crc kubenswrapper[4860]: I0320 11:01:25.796554 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a39a5b27057452598a534207e217f2291067df2b9f67d84a1fc26e76a94fa5a" Mar 20 11:01:35 crc kubenswrapper[4860]: I0320 11:01:35.364431 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-r7ckk"] Mar 20 11:01:35 crc kubenswrapper[4860]: I0320 11:01:35.365339 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-r7ckk" podUID="f0e14a08-824b-450f-bf98-2a476da0d44b" containerName="registry-server" containerID="cri-o://1aa3a53909a5c843b03ac48e95c43de617bf5c9f3bb63dc85380fa7fe7677bb4" gracePeriod=2 Mar 20 11:01:35 crc kubenswrapper[4860]: I0320 11:01:35.570180 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jx27x"] Mar 20 11:01:35 crc kubenswrapper[4860]: I0320 11:01:35.570573 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jx27x" podUID="f81a43aa-2c39-4d49-8526-f097322dd7bf" containerName="registry-server" containerID="cri-o://19e5b54cfca0bde6cf7f8393230c7ed69a1c9b30a1db1b83c9b87021655f765e" gracePeriod=2 Mar 20 11:01:35 crc kubenswrapper[4860]: I0320 11:01:35.741784 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-r7ckk" Mar 20 11:01:35 crc kubenswrapper[4860]: I0320 11:01:35.838484 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0e14a08-824b-450f-bf98-2a476da0d44b-utilities\") pod \"f0e14a08-824b-450f-bf98-2a476da0d44b\" (UID: \"f0e14a08-824b-450f-bf98-2a476da0d44b\") " Mar 20 11:01:35 crc kubenswrapper[4860]: I0320 11:01:35.838959 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0e14a08-824b-450f-bf98-2a476da0d44b-catalog-content\") pod \"f0e14a08-824b-450f-bf98-2a476da0d44b\" (UID: \"f0e14a08-824b-450f-bf98-2a476da0d44b\") " Mar 20 11:01:35 crc kubenswrapper[4860]: I0320 11:01:35.839068 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhz7s\" (UniqueName: \"kubernetes.io/projected/f0e14a08-824b-450f-bf98-2a476da0d44b-kube-api-access-jhz7s\") pod \"f0e14a08-824b-450f-bf98-2a476da0d44b\" (UID: \"f0e14a08-824b-450f-bf98-2a476da0d44b\") " Mar 20 11:01:35 crc kubenswrapper[4860]: I0320 11:01:35.839569 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0e14a08-824b-450f-bf98-2a476da0d44b-utilities" (OuterVolumeSpecName: "utilities") pod "f0e14a08-824b-450f-bf98-2a476da0d44b" (UID: "f0e14a08-824b-450f-bf98-2a476da0d44b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:01:35 crc kubenswrapper[4860]: I0320 11:01:35.844069 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0e14a08-824b-450f-bf98-2a476da0d44b-kube-api-access-jhz7s" (OuterVolumeSpecName: "kube-api-access-jhz7s") pod "f0e14a08-824b-450f-bf98-2a476da0d44b" (UID: "f0e14a08-824b-450f-bf98-2a476da0d44b"). InnerVolumeSpecName "kube-api-access-jhz7s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:01:35 crc kubenswrapper[4860]: I0320 11:01:35.865121 4860 generic.go:334] "Generic (PLEG): container finished" podID="f81a43aa-2c39-4d49-8526-f097322dd7bf" containerID="19e5b54cfca0bde6cf7f8393230c7ed69a1c9b30a1db1b83c9b87021655f765e" exitCode=0 Mar 20 11:01:35 crc kubenswrapper[4860]: I0320 11:01:35.865150 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jx27x" event={"ID":"f81a43aa-2c39-4d49-8526-f097322dd7bf","Type":"ContainerDied","Data":"19e5b54cfca0bde6cf7f8393230c7ed69a1c9b30a1db1b83c9b87021655f765e"} Mar 20 11:01:35 crc kubenswrapper[4860]: I0320 11:01:35.868113 4860 generic.go:334] "Generic (PLEG): container finished" podID="f0e14a08-824b-450f-bf98-2a476da0d44b" containerID="1aa3a53909a5c843b03ac48e95c43de617bf5c9f3bb63dc85380fa7fe7677bb4" exitCode=0 Mar 20 11:01:35 crc kubenswrapper[4860]: I0320 11:01:35.868172 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-r7ckk" Mar 20 11:01:35 crc kubenswrapper[4860]: I0320 11:01:35.868171 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r7ckk" event={"ID":"f0e14a08-824b-450f-bf98-2a476da0d44b","Type":"ContainerDied","Data":"1aa3a53909a5c843b03ac48e95c43de617bf5c9f3bb63dc85380fa7fe7677bb4"} Mar 20 11:01:35 crc kubenswrapper[4860]: I0320 11:01:35.868325 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r7ckk" event={"ID":"f0e14a08-824b-450f-bf98-2a476da0d44b","Type":"ContainerDied","Data":"711ef831caa70569060ba2dc068e9cede6a21ca93c6a666bf7abd4f4e2156736"} Mar 20 11:01:35 crc kubenswrapper[4860]: I0320 11:01:35.868357 4860 scope.go:117] "RemoveContainer" containerID="1aa3a53909a5c843b03ac48e95c43de617bf5c9f3bb63dc85380fa7fe7677bb4" Mar 20 11:01:35 crc kubenswrapper[4860]: I0320 11:01:35.896616 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0e14a08-824b-450f-bf98-2a476da0d44b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f0e14a08-824b-450f-bf98-2a476da0d44b" (UID: "f0e14a08-824b-450f-bf98-2a476da0d44b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:01:35 crc kubenswrapper[4860]: I0320 11:01:35.897089 4860 scope.go:117] "RemoveContainer" containerID="a5700f11fea1e3d3f4586b36f718ac04fc6f4838eaf0dec6f0868235a01313a6" Mar 20 11:01:35 crc kubenswrapper[4860]: I0320 11:01:35.925511 4860 scope.go:117] "RemoveContainer" containerID="19dec2e9725dfca97383712d1fced11c792963a7799dee6ec4f9f37020ff60b2" Mar 20 11:01:35 crc kubenswrapper[4860]: I0320 11:01:35.938657 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jx27x" Mar 20 11:01:35 crc kubenswrapper[4860]: I0320 11:01:35.940315 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhz7s\" (UniqueName: \"kubernetes.io/projected/f0e14a08-824b-450f-bf98-2a476da0d44b-kube-api-access-jhz7s\") on node \"crc\" DevicePath \"\"" Mar 20 11:01:35 crc kubenswrapper[4860]: I0320 11:01:35.940339 4860 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0e14a08-824b-450f-bf98-2a476da0d44b-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:01:35 crc kubenswrapper[4860]: I0320 11:01:35.940349 4860 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0e14a08-824b-450f-bf98-2a476da0d44b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:01:35 crc kubenswrapper[4860]: I0320 11:01:35.946564 4860 scope.go:117] "RemoveContainer" containerID="1aa3a53909a5c843b03ac48e95c43de617bf5c9f3bb63dc85380fa7fe7677bb4" Mar 20 11:01:35 crc kubenswrapper[4860]: E0320 11:01:35.947110 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1aa3a53909a5c843b03ac48e95c43de617bf5c9f3bb63dc85380fa7fe7677bb4\": container with ID starting with 1aa3a53909a5c843b03ac48e95c43de617bf5c9f3bb63dc85380fa7fe7677bb4 not found: ID does not exist" containerID="1aa3a53909a5c843b03ac48e95c43de617bf5c9f3bb63dc85380fa7fe7677bb4" Mar 20 11:01:35 crc kubenswrapper[4860]: I0320 11:01:35.947186 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1aa3a53909a5c843b03ac48e95c43de617bf5c9f3bb63dc85380fa7fe7677bb4"} err="failed to get container status \"1aa3a53909a5c843b03ac48e95c43de617bf5c9f3bb63dc85380fa7fe7677bb4\": rpc error: code = NotFound desc = could not find container \"1aa3a53909a5c843b03ac48e95c43de617bf5c9f3bb63dc85380fa7fe7677bb4\": 
container with ID starting with 1aa3a53909a5c843b03ac48e95c43de617bf5c9f3bb63dc85380fa7fe7677bb4 not found: ID does not exist" Mar 20 11:01:35 crc kubenswrapper[4860]: I0320 11:01:35.947277 4860 scope.go:117] "RemoveContainer" containerID="a5700f11fea1e3d3f4586b36f718ac04fc6f4838eaf0dec6f0868235a01313a6" Mar 20 11:01:35 crc kubenswrapper[4860]: E0320 11:01:35.947928 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5700f11fea1e3d3f4586b36f718ac04fc6f4838eaf0dec6f0868235a01313a6\": container with ID starting with a5700f11fea1e3d3f4586b36f718ac04fc6f4838eaf0dec6f0868235a01313a6 not found: ID does not exist" containerID="a5700f11fea1e3d3f4586b36f718ac04fc6f4838eaf0dec6f0868235a01313a6" Mar 20 11:01:35 crc kubenswrapper[4860]: I0320 11:01:35.947976 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5700f11fea1e3d3f4586b36f718ac04fc6f4838eaf0dec6f0868235a01313a6"} err="failed to get container status \"a5700f11fea1e3d3f4586b36f718ac04fc6f4838eaf0dec6f0868235a01313a6\": rpc error: code = NotFound desc = could not find container \"a5700f11fea1e3d3f4586b36f718ac04fc6f4838eaf0dec6f0868235a01313a6\": container with ID starting with a5700f11fea1e3d3f4586b36f718ac04fc6f4838eaf0dec6f0868235a01313a6 not found: ID does not exist" Mar 20 11:01:35 crc kubenswrapper[4860]: I0320 11:01:35.948006 4860 scope.go:117] "RemoveContainer" containerID="19dec2e9725dfca97383712d1fced11c792963a7799dee6ec4f9f37020ff60b2" Mar 20 11:01:35 crc kubenswrapper[4860]: E0320 11:01:35.948357 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19dec2e9725dfca97383712d1fced11c792963a7799dee6ec4f9f37020ff60b2\": container with ID starting with 19dec2e9725dfca97383712d1fced11c792963a7799dee6ec4f9f37020ff60b2 not found: ID does not exist" 
containerID="19dec2e9725dfca97383712d1fced11c792963a7799dee6ec4f9f37020ff60b2" Mar 20 11:01:35 crc kubenswrapper[4860]: I0320 11:01:35.948378 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19dec2e9725dfca97383712d1fced11c792963a7799dee6ec4f9f37020ff60b2"} err="failed to get container status \"19dec2e9725dfca97383712d1fced11c792963a7799dee6ec4f9f37020ff60b2\": rpc error: code = NotFound desc = could not find container \"19dec2e9725dfca97383712d1fced11c792963a7799dee6ec4f9f37020ff60b2\": container with ID starting with 19dec2e9725dfca97383712d1fced11c792963a7799dee6ec4f9f37020ff60b2 not found: ID does not exist" Mar 20 11:01:36 crc kubenswrapper[4860]: I0320 11:01:36.041655 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f81a43aa-2c39-4d49-8526-f097322dd7bf-catalog-content\") pod \"f81a43aa-2c39-4d49-8526-f097322dd7bf\" (UID: \"f81a43aa-2c39-4d49-8526-f097322dd7bf\") " Mar 20 11:01:36 crc kubenswrapper[4860]: I0320 11:01:36.041759 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f81a43aa-2c39-4d49-8526-f097322dd7bf-utilities\") pod \"f81a43aa-2c39-4d49-8526-f097322dd7bf\" (UID: \"f81a43aa-2c39-4d49-8526-f097322dd7bf\") " Mar 20 11:01:36 crc kubenswrapper[4860]: I0320 11:01:36.041794 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mn9d7\" (UniqueName: \"kubernetes.io/projected/f81a43aa-2c39-4d49-8526-f097322dd7bf-kube-api-access-mn9d7\") pod \"f81a43aa-2c39-4d49-8526-f097322dd7bf\" (UID: \"f81a43aa-2c39-4d49-8526-f097322dd7bf\") " Mar 20 11:01:36 crc kubenswrapper[4860]: I0320 11:01:36.043115 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f81a43aa-2c39-4d49-8526-f097322dd7bf-utilities" (OuterVolumeSpecName: "utilities") pod 
"f81a43aa-2c39-4d49-8526-f097322dd7bf" (UID: "f81a43aa-2c39-4d49-8526-f097322dd7bf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 11:01:36 crc kubenswrapper[4860]: I0320 11:01:36.045495 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f81a43aa-2c39-4d49-8526-f097322dd7bf-kube-api-access-mn9d7" (OuterVolumeSpecName: "kube-api-access-mn9d7") pod "f81a43aa-2c39-4d49-8526-f097322dd7bf" (UID: "f81a43aa-2c39-4d49-8526-f097322dd7bf"). InnerVolumeSpecName "kube-api-access-mn9d7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 11:01:36 crc kubenswrapper[4860]: I0320 11:01:36.146221 4860 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f81a43aa-2c39-4d49-8526-f097322dd7bf-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 11:01:36 crc kubenswrapper[4860]: I0320 11:01:36.146404 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mn9d7\" (UniqueName: \"kubernetes.io/projected/f81a43aa-2c39-4d49-8526-f097322dd7bf-kube-api-access-mn9d7\") on node \"crc\" DevicePath \"\""
Mar 20 11:01:36 crc kubenswrapper[4860]: I0320 11:01:36.158314 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f81a43aa-2c39-4d49-8526-f097322dd7bf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f81a43aa-2c39-4d49-8526-f097322dd7bf" (UID: "f81a43aa-2c39-4d49-8526-f097322dd7bf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 11:01:36 crc kubenswrapper[4860]: I0320 11:01:36.200314 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-r7ckk"]
Mar 20 11:01:36 crc kubenswrapper[4860]: I0320 11:01:36.204275 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-r7ckk"]
Mar 20 11:01:36 crc kubenswrapper[4860]: I0320 11:01:36.247860 4860 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f81a43aa-2c39-4d49-8526-f097322dd7bf-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 11:01:36 crc kubenswrapper[4860]: I0320 11:01:36.877247 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jx27x" event={"ID":"f81a43aa-2c39-4d49-8526-f097322dd7bf","Type":"ContainerDied","Data":"b681f56bdfe5d7107d46e455eb32b402c90f63f71364660357f8ecd488fda604"}
Mar 20 11:01:36 crc kubenswrapper[4860]: I0320 11:01:36.877305 4860 scope.go:117] "RemoveContainer" containerID="19e5b54cfca0bde6cf7f8393230c7ed69a1c9b30a1db1b83c9b87021655f765e"
Mar 20 11:01:36 crc kubenswrapper[4860]: I0320 11:01:36.877343 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jx27x"
Mar 20 11:01:36 crc kubenswrapper[4860]: I0320 11:01:36.907534 4860 scope.go:117] "RemoveContainer" containerID="5000df7098db21086cd500235d0dfe1cd2c6c277e01022d3498595c652a31046"
Mar 20 11:01:36 crc kubenswrapper[4860]: I0320 11:01:36.925601 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jx27x"]
Mar 20 11:01:36 crc kubenswrapper[4860]: I0320 11:01:36.932358 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jx27x"]
Mar 20 11:01:36 crc kubenswrapper[4860]: I0320 11:01:36.940317 4860 scope.go:117] "RemoveContainer" containerID="cc28a9c4b1f826fc06b2b83281cd0a01bf1dc28b3e9617ab722a34ea90577dc6"
Mar 20 11:01:37 crc kubenswrapper[4860]: I0320 11:01:37.424969 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0e14a08-824b-450f-bf98-2a476da0d44b" path="/var/lib/kubelet/pods/f0e14a08-824b-450f-bf98-2a476da0d44b/volumes"
Mar 20 11:01:37 crc kubenswrapper[4860]: I0320 11:01:37.426669 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f81a43aa-2c39-4d49-8526-f097322dd7bf" path="/var/lib/kubelet/pods/f81a43aa-2c39-4d49-8526-f097322dd7bf/volumes"
Mar 20 11:01:52 crc kubenswrapper[4860]: I0320 11:01:52.344269 4860 patch_prober.go:28] interesting pod/machine-config-daemon-kvdqp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 11:01:52 crc kubenswrapper[4860]: I0320 11:01:52.344964 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 11:02:00 crc kubenswrapper[4860]: I0320 11:02:00.144736 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566742-tczkf"]
Mar 20 11:02:00 crc kubenswrapper[4860]: E0320 11:02:00.146417 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0e14a08-824b-450f-bf98-2a476da0d44b" containerName="registry-server"
Mar 20 11:02:00 crc kubenswrapper[4860]: I0320 11:02:00.146497 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0e14a08-824b-450f-bf98-2a476da0d44b" containerName="registry-server"
Mar 20 11:02:00 crc kubenswrapper[4860]: E0320 11:02:00.146566 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41a09ead-8137-4791-896c-c5a9cad7f4cf" containerName="collect-profiles"
Mar 20 11:02:00 crc kubenswrapper[4860]: I0320 11:02:00.146623 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="41a09ead-8137-4791-896c-c5a9cad7f4cf" containerName="collect-profiles"
Mar 20 11:02:00 crc kubenswrapper[4860]: E0320 11:02:00.146688 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b31d1240-ea69-4da9-9a40-70f252222d4d" containerName="oc"
Mar 20 11:02:00 crc kubenswrapper[4860]: I0320 11:02:00.146750 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="b31d1240-ea69-4da9-9a40-70f252222d4d" containerName="oc"
Mar 20 11:02:00 crc kubenswrapper[4860]: E0320 11:02:00.146811 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0e14a08-824b-450f-bf98-2a476da0d44b" containerName="extract-content"
Mar 20 11:02:00 crc kubenswrapper[4860]: I0320 11:02:00.146868 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0e14a08-824b-450f-bf98-2a476da0d44b" containerName="extract-content"
Mar 20 11:02:00 crc kubenswrapper[4860]: E0320 11:02:00.146925 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0e14a08-824b-450f-bf98-2a476da0d44b" containerName="extract-utilities"
Mar 20 11:02:00 crc kubenswrapper[4860]: I0320 11:02:00.146979 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0e14a08-824b-450f-bf98-2a476da0d44b" containerName="extract-utilities"
Mar 20 11:02:00 crc kubenswrapper[4860]: E0320 11:02:00.147040 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f81a43aa-2c39-4d49-8526-f097322dd7bf" containerName="extract-utilities"
Mar 20 11:02:00 crc kubenswrapper[4860]: I0320 11:02:00.147399 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="f81a43aa-2c39-4d49-8526-f097322dd7bf" containerName="extract-utilities"
Mar 20 11:02:00 crc kubenswrapper[4860]: E0320 11:02:00.147480 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f81a43aa-2c39-4d49-8526-f097322dd7bf" containerName="extract-content"
Mar 20 11:02:00 crc kubenswrapper[4860]: I0320 11:02:00.147551 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="f81a43aa-2c39-4d49-8526-f097322dd7bf" containerName="extract-content"
Mar 20 11:02:00 crc kubenswrapper[4860]: E0320 11:02:00.147620 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f81a43aa-2c39-4d49-8526-f097322dd7bf" containerName="registry-server"
Mar 20 11:02:00 crc kubenswrapper[4860]: I0320 11:02:00.147698 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="f81a43aa-2c39-4d49-8526-f097322dd7bf" containerName="registry-server"
Mar 20 11:02:00 crc kubenswrapper[4860]: I0320 11:02:00.147878 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0e14a08-824b-450f-bf98-2a476da0d44b" containerName="registry-server"
Mar 20 11:02:00 crc kubenswrapper[4860]: I0320 11:02:00.147949 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="41a09ead-8137-4791-896c-c5a9cad7f4cf" containerName="collect-profiles"
Mar 20 11:02:00 crc kubenswrapper[4860]: I0320 11:02:00.148012 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="b31d1240-ea69-4da9-9a40-70f252222d4d" containerName="oc"
Mar 20 11:02:00 crc kubenswrapper[4860]: I0320 11:02:00.148075 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="f81a43aa-2c39-4d49-8526-f097322dd7bf" containerName="registry-server"
Mar 20 11:02:00 crc kubenswrapper[4860]: I0320 11:02:00.148593 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566742-tczkf"
Mar 20 11:02:00 crc kubenswrapper[4860]: I0320 11:02:00.151972 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 11:02:00 crc kubenswrapper[4860]: I0320 11:02:00.152843 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-82p2f"
Mar 20 11:02:00 crc kubenswrapper[4860]: I0320 11:02:00.158631 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566742-tczkf"]
Mar 20 11:02:00 crc kubenswrapper[4860]: I0320 11:02:00.158807 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 11:02:00 crc kubenswrapper[4860]: I0320 11:02:00.160402 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h48qf\" (UniqueName: \"kubernetes.io/projected/980f5756-4935-469d-933b-f4e339ded9a4-kube-api-access-h48qf\") pod \"auto-csr-approver-29566742-tczkf\" (UID: \"980f5756-4935-469d-933b-f4e339ded9a4\") " pod="openshift-infra/auto-csr-approver-29566742-tczkf"
Mar 20 11:02:00 crc kubenswrapper[4860]: I0320 11:02:00.262579 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h48qf\" (UniqueName: \"kubernetes.io/projected/980f5756-4935-469d-933b-f4e339ded9a4-kube-api-access-h48qf\") pod \"auto-csr-approver-29566742-tczkf\" (UID: \"980f5756-4935-469d-933b-f4e339ded9a4\") " pod="openshift-infra/auto-csr-approver-29566742-tczkf"
Mar 20 11:02:00 crc kubenswrapper[4860]: I0320 11:02:00.289939 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h48qf\" (UniqueName: \"kubernetes.io/projected/980f5756-4935-469d-933b-f4e339ded9a4-kube-api-access-h48qf\") pod \"auto-csr-approver-29566742-tczkf\" (UID: \"980f5756-4935-469d-933b-f4e339ded9a4\") " pod="openshift-infra/auto-csr-approver-29566742-tczkf"
Mar 20 11:02:00 crc kubenswrapper[4860]: I0320 11:02:00.469843 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566742-tczkf"
Mar 20 11:02:01 crc kubenswrapper[4860]: I0320 11:02:01.031833 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566742-tczkf"]
Mar 20 11:02:02 crc kubenswrapper[4860]: I0320 11:02:02.032372 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566742-tczkf" event={"ID":"980f5756-4935-469d-933b-f4e339ded9a4","Type":"ContainerStarted","Data":"7daa785220c2781aa57144adb5ff002190bf0947817569818a884fa78dbf2974"}
Mar 20 11:02:03 crc kubenswrapper[4860]: I0320 11:02:03.043191 4860 generic.go:334] "Generic (PLEG): container finished" podID="980f5756-4935-469d-933b-f4e339ded9a4" containerID="5e0b7b6725e58dc6c9517f6806ff8c8ba7c117d2ad076272e2c94e40ea777f46" exitCode=0
Mar 20 11:02:03 crc kubenswrapper[4860]: I0320 11:02:03.043932 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566742-tczkf" event={"ID":"980f5756-4935-469d-933b-f4e339ded9a4","Type":"ContainerDied","Data":"5e0b7b6725e58dc6c9517f6806ff8c8ba7c117d2ad076272e2c94e40ea777f46"}
Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.157667 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-w5w95"]
Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.159062 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-w5w95" podUID="4f84f111-5991-4e78-9508-82283b8e36f7" containerName="registry-server" containerID="cri-o://bd0572960d7229285fe10a8cb3d58bbc9ca87bc5ab990b09db80034d14aa31ff" gracePeriod=30
Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.182585 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-n79b7"]
Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.182916 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-n79b7" podUID="d2690d8b-c7f7-4e71-af44-33444e4d6187" containerName="registry-server" containerID="cri-o://8e14e348104a685ae72ae8d4fc93750bfae41d97762c8a26ba93a04b1dc8a3d4" gracePeriod=30
Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.195187 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zhgh4"]
Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.195435 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-zhgh4" podUID="403ca5f6-bd52-40de-88d6-5151b3202c76" containerName="marketplace-operator" containerID="cri-o://71b2cb77686050f112cbf04ff04d1685bf9b88c82f8a98a4da293848bc821143" gracePeriod=30
Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.216950 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d9xlp"]
Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.217249 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-d9xlp" podUID="f20cb95e-5480-4c9c-859f-0b03d679ab06" containerName="registry-server" containerID="cri-o://3d9773a58e39a2f11588636465b237f3ac7e55d6ecd57abbb192e60cfa1df11f" gracePeriod=30
Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.227405 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qq8bh"]
Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.227644 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qq8bh" podUID="514f05c3-1404-46c6-9f4d-68437ea8ee0b" containerName="registry-server" containerID="cri-o://dd62fe82165d292d4799e06ebb35b908db080de2fe5e228a5d81891d222fa958" gracePeriod=30
Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.232200 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qkfjv"]
Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.233588 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qkfjv"
Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.270790 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qkfjv"]
Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.334093 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/489f9463-a47c-4635-aad3-866e47a2c97f-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qkfjv\" (UID: \"489f9463-a47c-4635-aad3-866e47a2c97f\") " pod="openshift-marketplace/marketplace-operator-79b997595-qkfjv"
Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.334191 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/489f9463-a47c-4635-aad3-866e47a2c97f-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qkfjv\" (UID: \"489f9463-a47c-4635-aad3-866e47a2c97f\") " pod="openshift-marketplace/marketplace-operator-79b997595-qkfjv"
Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.334283 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnpfl\" (UniqueName: \"kubernetes.io/projected/489f9463-a47c-4635-aad3-866e47a2c97f-kube-api-access-pnpfl\") pod \"marketplace-operator-79b997595-qkfjv\" (UID: \"489f9463-a47c-4635-aad3-866e47a2c97f\") " pod="openshift-marketplace/marketplace-operator-79b997595-qkfjv"
Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.436931 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/489f9463-a47c-4635-aad3-866e47a2c97f-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qkfjv\" (UID: \"489f9463-a47c-4635-aad3-866e47a2c97f\") " pod="openshift-marketplace/marketplace-operator-79b997595-qkfjv"
Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.437556 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnpfl\" (UniqueName: \"kubernetes.io/projected/489f9463-a47c-4635-aad3-866e47a2c97f-kube-api-access-pnpfl\") pod \"marketplace-operator-79b997595-qkfjv\" (UID: \"489f9463-a47c-4635-aad3-866e47a2c97f\") " pod="openshift-marketplace/marketplace-operator-79b997595-qkfjv"
Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.437611 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/489f9463-a47c-4635-aad3-866e47a2c97f-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qkfjv\" (UID: \"489f9463-a47c-4635-aad3-866e47a2c97f\") " pod="openshift-marketplace/marketplace-operator-79b997595-qkfjv"
Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.439510 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/489f9463-a47c-4635-aad3-866e47a2c97f-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qkfjv\" (UID: \"489f9463-a47c-4635-aad3-866e47a2c97f\") " pod="openshift-marketplace/marketplace-operator-79b997595-qkfjv"
Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.448120 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/489f9463-a47c-4635-aad3-866e47a2c97f-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qkfjv\" (UID: \"489f9463-a47c-4635-aad3-866e47a2c97f\") " pod="openshift-marketplace/marketplace-operator-79b997595-qkfjv"
Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.465048 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnpfl\" (UniqueName: \"kubernetes.io/projected/489f9463-a47c-4635-aad3-866e47a2c97f-kube-api-access-pnpfl\") pod \"marketplace-operator-79b997595-qkfjv\" (UID: \"489f9463-a47c-4635-aad3-866e47a2c97f\") " pod="openshift-marketplace/marketplace-operator-79b997595-qkfjv"
Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.608890 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qkfjv"
Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.619710 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566742-tczkf"
Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.642369 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h48qf\" (UniqueName: \"kubernetes.io/projected/980f5756-4935-469d-933b-f4e339ded9a4-kube-api-access-h48qf\") pod \"980f5756-4935-469d-933b-f4e339ded9a4\" (UID: \"980f5756-4935-469d-933b-f4e339ded9a4\") "
Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.647466 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/980f5756-4935-469d-933b-f4e339ded9a4-kube-api-access-h48qf" (OuterVolumeSpecName: "kube-api-access-h48qf") pod "980f5756-4935-469d-933b-f4e339ded9a4" (UID: "980f5756-4935-469d-933b-f4e339ded9a4"). InnerVolumeSpecName "kube-api-access-h48qf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.691122 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n79b7"
Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.742786 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w5w95"
Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.743704 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2690d8b-c7f7-4e71-af44-33444e4d6187-utilities\") pod \"d2690d8b-c7f7-4e71-af44-33444e4d6187\" (UID: \"d2690d8b-c7f7-4e71-af44-33444e4d6187\") "
Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.745202 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6fks5\" (UniqueName: \"kubernetes.io/projected/d2690d8b-c7f7-4e71-af44-33444e4d6187-kube-api-access-6fks5\") pod \"d2690d8b-c7f7-4e71-af44-33444e4d6187\" (UID: \"d2690d8b-c7f7-4e71-af44-33444e4d6187\") "
Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.745536 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2690d8b-c7f7-4e71-af44-33444e4d6187-catalog-content\") pod \"d2690d8b-c7f7-4e71-af44-33444e4d6187\" (UID: \"d2690d8b-c7f7-4e71-af44-33444e4d6187\") "
Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.745815 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h48qf\" (UniqueName: \"kubernetes.io/projected/980f5756-4935-469d-933b-f4e339ded9a4-kube-api-access-h48qf\") on node \"crc\" DevicePath \"\""
Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.746544 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2690d8b-c7f7-4e71-af44-33444e4d6187-utilities" (OuterVolumeSpecName: "utilities") pod "d2690d8b-c7f7-4e71-af44-33444e4d6187" (UID: "d2690d8b-c7f7-4e71-af44-33444e4d6187"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.753618 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2690d8b-c7f7-4e71-af44-33444e4d6187-kube-api-access-6fks5" (OuterVolumeSpecName: "kube-api-access-6fks5") pod "d2690d8b-c7f7-4e71-af44-33444e4d6187" (UID: "d2690d8b-c7f7-4e71-af44-33444e4d6187"). InnerVolumeSpecName "kube-api-access-6fks5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.855126 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f84f111-5991-4e78-9508-82283b8e36f7-catalog-content\") pod \"4f84f111-5991-4e78-9508-82283b8e36f7\" (UID: \"4f84f111-5991-4e78-9508-82283b8e36f7\") "
Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.855595 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4h2z2\" (UniqueName: \"kubernetes.io/projected/4f84f111-5991-4e78-9508-82283b8e36f7-kube-api-access-4h2z2\") pod \"4f84f111-5991-4e78-9508-82283b8e36f7\" (UID: \"4f84f111-5991-4e78-9508-82283b8e36f7\") "
Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.855688 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f84f111-5991-4e78-9508-82283b8e36f7-utilities\") pod \"4f84f111-5991-4e78-9508-82283b8e36f7\" (UID: \"4f84f111-5991-4e78-9508-82283b8e36f7\") "
Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.856120 4860 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2690d8b-c7f7-4e71-af44-33444e4d6187-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.856149 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6fks5\" (UniqueName: \"kubernetes.io/projected/d2690d8b-c7f7-4e71-af44-33444e4d6187-kube-api-access-6fks5\") on node \"crc\" DevicePath \"\""
Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.857062 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f84f111-5991-4e78-9508-82283b8e36f7-utilities" (OuterVolumeSpecName: "utilities") pod "4f84f111-5991-4e78-9508-82283b8e36f7" (UID: "4f84f111-5991-4e78-9508-82283b8e36f7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.866415 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f84f111-5991-4e78-9508-82283b8e36f7-kube-api-access-4h2z2" (OuterVolumeSpecName: "kube-api-access-4h2z2") pod "4f84f111-5991-4e78-9508-82283b8e36f7" (UID: "4f84f111-5991-4e78-9508-82283b8e36f7"). InnerVolumeSpecName "kube-api-access-4h2z2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.909395 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qq8bh"
Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.937358 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zhgh4"
Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.958570 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d9xlp"
Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.977715 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/514f05c3-1404-46c6-9f4d-68437ea8ee0b-catalog-content\") pod \"514f05c3-1404-46c6-9f4d-68437ea8ee0b\" (UID: \"514f05c3-1404-46c6-9f4d-68437ea8ee0b\") "
Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.977788 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/514f05c3-1404-46c6-9f4d-68437ea8ee0b-utilities\") pod \"514f05c3-1404-46c6-9f4d-68437ea8ee0b\" (UID: \"514f05c3-1404-46c6-9f4d-68437ea8ee0b\") "
Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.977922 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdzkg\" (UniqueName: \"kubernetes.io/projected/514f05c3-1404-46c6-9f4d-68437ea8ee0b-kube-api-access-pdzkg\") pod \"514f05c3-1404-46c6-9f4d-68437ea8ee0b\" (UID: \"514f05c3-1404-46c6-9f4d-68437ea8ee0b\") "
Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.978205 4860 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f84f111-5991-4e78-9508-82283b8e36f7-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.978217 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4h2z2\" (UniqueName: \"kubernetes.io/projected/4f84f111-5991-4e78-9508-82283b8e36f7-kube-api-access-4h2z2\") on node \"crc\" DevicePath \"\""
Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.980048 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/514f05c3-1404-46c6-9f4d-68437ea8ee0b-utilities" (OuterVolumeSpecName: "utilities") pod "514f05c3-1404-46c6-9f4d-68437ea8ee0b" (UID: "514f05c3-1404-46c6-9f4d-68437ea8ee0b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.993000 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2690d8b-c7f7-4e71-af44-33444e4d6187-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d2690d8b-c7f7-4e71-af44-33444e4d6187" (UID: "d2690d8b-c7f7-4e71-af44-33444e4d6187"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.997912 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f84f111-5991-4e78-9508-82283b8e36f7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4f84f111-5991-4e78-9508-82283b8e36f7" (UID: "4f84f111-5991-4e78-9508-82283b8e36f7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.009484 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/514f05c3-1404-46c6-9f4d-68437ea8ee0b-kube-api-access-pdzkg" (OuterVolumeSpecName: "kube-api-access-pdzkg") pod "514f05c3-1404-46c6-9f4d-68437ea8ee0b" (UID: "514f05c3-1404-46c6-9f4d-68437ea8ee0b"). InnerVolumeSpecName "kube-api-access-pdzkg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.069773 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566742-tczkf" event={"ID":"980f5756-4935-469d-933b-f4e339ded9a4","Type":"ContainerDied","Data":"7daa785220c2781aa57144adb5ff002190bf0947817569818a884fa78dbf2974"}
Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.069821 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7daa785220c2781aa57144adb5ff002190bf0947817569818a884fa78dbf2974"
Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.069882 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566742-tczkf"
Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.079722 4860 generic.go:334] "Generic (PLEG): container finished" podID="f20cb95e-5480-4c9c-859f-0b03d679ab06" containerID="3d9773a58e39a2f11588636465b237f3ac7e55d6ecd57abbb192e60cfa1df11f" exitCode=0
Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.079797 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d9xlp" event={"ID":"f20cb95e-5480-4c9c-859f-0b03d679ab06","Type":"ContainerDied","Data":"3d9773a58e39a2f11588636465b237f3ac7e55d6ecd57abbb192e60cfa1df11f"}
Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.079830 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d9xlp" event={"ID":"f20cb95e-5480-4c9c-859f-0b03d679ab06","Type":"ContainerDied","Data":"39591daa264ba7bebe5fdc529015addd733110c1c54ed6b98d8a162a754e8d60"}
Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.079849 4860 scope.go:117] "RemoveContainer" containerID="3d9773a58e39a2f11588636465b237f3ac7e55d6ecd57abbb192e60cfa1df11f"
Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.080183 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d9xlp"
Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.081658 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/403ca5f6-bd52-40de-88d6-5151b3202c76-marketplace-trusted-ca\") pod \"403ca5f6-bd52-40de-88d6-5151b3202c76\" (UID: \"403ca5f6-bd52-40de-88d6-5151b3202c76\") "
Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.081723 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kg46p\" (UniqueName: \"kubernetes.io/projected/f20cb95e-5480-4c9c-859f-0b03d679ab06-kube-api-access-kg46p\") pod \"f20cb95e-5480-4c9c-859f-0b03d679ab06\" (UID: \"f20cb95e-5480-4c9c-859f-0b03d679ab06\") "
Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.081753 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/403ca5f6-bd52-40de-88d6-5151b3202c76-marketplace-operator-metrics\") pod \"403ca5f6-bd52-40de-88d6-5151b3202c76\" (UID: \"403ca5f6-bd52-40de-88d6-5151b3202c76\") "
Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.081810 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hqvlf\" (UniqueName: \"kubernetes.io/projected/403ca5f6-bd52-40de-88d6-5151b3202c76-kube-api-access-hqvlf\") pod \"403ca5f6-bd52-40de-88d6-5151b3202c76\" (UID: \"403ca5f6-bd52-40de-88d6-5151b3202c76\") "
Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.081828 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f20cb95e-5480-4c9c-859f-0b03d679ab06-catalog-content\") pod \"f20cb95e-5480-4c9c-859f-0b03d679ab06\" (UID: \"f20cb95e-5480-4c9c-859f-0b03d679ab06\") "
Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.081851 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f20cb95e-5480-4c9c-859f-0b03d679ab06-utilities\") pod \"f20cb95e-5480-4c9c-859f-0b03d679ab06\" (UID: \"f20cb95e-5480-4c9c-859f-0b03d679ab06\") "
Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.082524 4860 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2690d8b-c7f7-4e71-af44-33444e4d6187-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.082541 4860 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/514f05c3-1404-46c6-9f4d-68437ea8ee0b-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.082551 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdzkg\" (UniqueName: \"kubernetes.io/projected/514f05c3-1404-46c6-9f4d-68437ea8ee0b-kube-api-access-pdzkg\") on node \"crc\" DevicePath \"\""
Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.082561 4860 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f84f111-5991-4e78-9508-82283b8e36f7-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.083673 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f20cb95e-5480-4c9c-859f-0b03d679ab06-utilities" (OuterVolumeSpecName: "utilities") pod "f20cb95e-5480-4c9c-859f-0b03d679ab06" (UID: "f20cb95e-5480-4c9c-859f-0b03d679ab06"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.086485 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/403ca5f6-bd52-40de-88d6-5151b3202c76-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "403ca5f6-bd52-40de-88d6-5151b3202c76" (UID: "403ca5f6-bd52-40de-88d6-5151b3202c76"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.092005 4860 generic.go:334] "Generic (PLEG): container finished" podID="403ca5f6-bd52-40de-88d6-5151b3202c76" containerID="71b2cb77686050f112cbf04ff04d1685bf9b88c82f8a98a4da293848bc821143" exitCode=0
Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.092131 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zhgh4"
Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.092365 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zhgh4" event={"ID":"403ca5f6-bd52-40de-88d6-5151b3202c76","Type":"ContainerDied","Data":"71b2cb77686050f112cbf04ff04d1685bf9b88c82f8a98a4da293848bc821143"}
Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.093044 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f20cb95e-5480-4c9c-859f-0b03d679ab06-kube-api-access-kg46p" (OuterVolumeSpecName: "kube-api-access-kg46p") pod "f20cb95e-5480-4c9c-859f-0b03d679ab06" (UID: "f20cb95e-5480-4c9c-859f-0b03d679ab06"). InnerVolumeSpecName "kube-api-access-kg46p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.093077 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zhgh4" event={"ID":"403ca5f6-bd52-40de-88d6-5151b3202c76","Type":"ContainerDied","Data":"2383508dcb35c927549b146a858bc10ffc0e92b071146282263459310c1e4a93"}
Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.100012 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/403ca5f6-bd52-40de-88d6-5151b3202c76-kube-api-access-hqvlf" (OuterVolumeSpecName: "kube-api-access-hqvlf") pod "403ca5f6-bd52-40de-88d6-5151b3202c76" (UID: "403ca5f6-bd52-40de-88d6-5151b3202c76"). InnerVolumeSpecName "kube-api-access-hqvlf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.103497 4860 generic.go:334] "Generic (PLEG): container finished" podID="514f05c3-1404-46c6-9f4d-68437ea8ee0b" containerID="dd62fe82165d292d4799e06ebb35b908db080de2fe5e228a5d81891d222fa958" exitCode=0
Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.103846 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qq8bh" event={"ID":"514f05c3-1404-46c6-9f4d-68437ea8ee0b","Type":"ContainerDied","Data":"dd62fe82165d292d4799e06ebb35b908db080de2fe5e228a5d81891d222fa958"}
Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.104020 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qq8bh" event={"ID":"514f05c3-1404-46c6-9f4d-68437ea8ee0b","Type":"ContainerDied","Data":"db09468d977aabd81ce312da99eaa8c50b25e5282affd310a612fbfda038e94c"}
Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.104215 4860 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/redhat-operators-qq8bh" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.108607 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qkfjv"] Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.110035 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/403ca5f6-bd52-40de-88d6-5151b3202c76-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "403ca5f6-bd52-40de-88d6-5151b3202c76" (UID: "403ca5f6-bd52-40de-88d6-5151b3202c76"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.114032 4860 generic.go:334] "Generic (PLEG): container finished" podID="4f84f111-5991-4e78-9508-82283b8e36f7" containerID="bd0572960d7229285fe10a8cb3d58bbc9ca87bc5ab990b09db80034d14aa31ff" exitCode=0 Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.114129 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w5w95" event={"ID":"4f84f111-5991-4e78-9508-82283b8e36f7","Type":"ContainerDied","Data":"bd0572960d7229285fe10a8cb3d58bbc9ca87bc5ab990b09db80034d14aa31ff"} Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.114155 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w5w95" event={"ID":"4f84f111-5991-4e78-9508-82283b8e36f7","Type":"ContainerDied","Data":"1f538c1360593e9a410b70b066b34c33f5665e2dac735a2212ce3b3dbdf2dce0"} Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.114267 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-w5w95" Mar 20 11:02:05 crc kubenswrapper[4860]: W0320 11:02:05.122806 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod489f9463_a47c_4635_aad3_866e47a2c97f.slice/crio-a32770f37d964985947e6b18171cb2879bd6d548a75a3c5df5ff64cac3f0250e WatchSource:0}: Error finding container a32770f37d964985947e6b18171cb2879bd6d548a75a3c5df5ff64cac3f0250e: Status 404 returned error can't find the container with id a32770f37d964985947e6b18171cb2879bd6d548a75a3c5df5ff64cac3f0250e Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.123996 4860 generic.go:334] "Generic (PLEG): container finished" podID="d2690d8b-c7f7-4e71-af44-33444e4d6187" containerID="8e14e348104a685ae72ae8d4fc93750bfae41d97762c8a26ba93a04b1dc8a3d4" exitCode=0 Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.124284 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n79b7" event={"ID":"d2690d8b-c7f7-4e71-af44-33444e4d6187","Type":"ContainerDied","Data":"8e14e348104a685ae72ae8d4fc93750bfae41d97762c8a26ba93a04b1dc8a3d4"} Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.124423 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n79b7" event={"ID":"d2690d8b-c7f7-4e71-af44-33444e4d6187","Type":"ContainerDied","Data":"d3e5a4f45fcca5a9f1ea6868d200bf518732a2225ce154b3a8c6e2fe9edbe0fc"} Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.125866 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n79b7" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.133248 4860 scope.go:117] "RemoveContainer" containerID="01d41166d3fd0e46f3b970c35d6bd76db150d430d4adb17e4e8f0fd1e47c7e63" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.134123 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f20cb95e-5480-4c9c-859f-0b03d679ab06-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f20cb95e-5480-4c9c-859f-0b03d679ab06" (UID: "f20cb95e-5480-4c9c-859f-0b03d679ab06"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.158841 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-w5w95"] Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.186612 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-w5w95"] Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.186982 4860 scope.go:117] "RemoveContainer" containerID="3c2e183391f543c0b20ce684b31a9a8169bae71ceb6b6c5028bb8d294a9daeac" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.190817 4860 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f20cb95e-5480-4c9c-859f-0b03d679ab06-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.190856 4860 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/403ca5f6-bd52-40de-88d6-5151b3202c76-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.190868 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kg46p\" (UniqueName: 
\"kubernetes.io/projected/f20cb95e-5480-4c9c-859f-0b03d679ab06-kube-api-access-kg46p\") on node \"crc\" DevicePath \"\"" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.190879 4860 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/403ca5f6-bd52-40de-88d6-5151b3202c76-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.190889 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hqvlf\" (UniqueName: \"kubernetes.io/projected/403ca5f6-bd52-40de-88d6-5151b3202c76-kube-api-access-hqvlf\") on node \"crc\" DevicePath \"\"" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.190904 4860 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f20cb95e-5480-4c9c-859f-0b03d679ab06-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.222317 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-zjfp8"] Mar 20 11:02:05 crc kubenswrapper[4860]: E0320 11:02:05.222964 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2690d8b-c7f7-4e71-af44-33444e4d6187" containerName="extract-utilities" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.222986 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2690d8b-c7f7-4e71-af44-33444e4d6187" containerName="extract-utilities" Mar 20 11:02:05 crc kubenswrapper[4860]: E0320 11:02:05.223006 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f20cb95e-5480-4c9c-859f-0b03d679ab06" containerName="extract-utilities" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.223038 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="f20cb95e-5480-4c9c-859f-0b03d679ab06" containerName="extract-utilities" Mar 20 11:02:05 crc kubenswrapper[4860]: E0320 
11:02:05.223053 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2690d8b-c7f7-4e71-af44-33444e4d6187" containerName="registry-server" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.223060 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2690d8b-c7f7-4e71-af44-33444e4d6187" containerName="registry-server" Mar 20 11:02:05 crc kubenswrapper[4860]: E0320 11:02:05.223077 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f20cb95e-5480-4c9c-859f-0b03d679ab06" containerName="registry-server" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.223083 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="f20cb95e-5480-4c9c-859f-0b03d679ab06" containerName="registry-server" Mar 20 11:02:05 crc kubenswrapper[4860]: E0320 11:02:05.223095 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2690d8b-c7f7-4e71-af44-33444e4d6187" containerName="extract-content" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.223100 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2690d8b-c7f7-4e71-af44-33444e4d6187" containerName="extract-content" Mar 20 11:02:05 crc kubenswrapper[4860]: E0320 11:02:05.223115 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="403ca5f6-bd52-40de-88d6-5151b3202c76" containerName="marketplace-operator" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.223121 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="403ca5f6-bd52-40de-88d6-5151b3202c76" containerName="marketplace-operator" Mar 20 11:02:05 crc kubenswrapper[4860]: E0320 11:02:05.223134 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="403ca5f6-bd52-40de-88d6-5151b3202c76" containerName="marketplace-operator" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.223140 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="403ca5f6-bd52-40de-88d6-5151b3202c76" containerName="marketplace-operator" Mar 20 11:02:05 crc kubenswrapper[4860]: 
E0320 11:02:05.223160 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f84f111-5991-4e78-9508-82283b8e36f7" containerName="extract-content" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.223166 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f84f111-5991-4e78-9508-82283b8e36f7" containerName="extract-content" Mar 20 11:02:05 crc kubenswrapper[4860]: E0320 11:02:05.223179 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f84f111-5991-4e78-9508-82283b8e36f7" containerName="extract-utilities" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.223186 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f84f111-5991-4e78-9508-82283b8e36f7" containerName="extract-utilities" Mar 20 11:02:05 crc kubenswrapper[4860]: E0320 11:02:05.223194 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="514f05c3-1404-46c6-9f4d-68437ea8ee0b" containerName="extract-utilities" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.223199 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="514f05c3-1404-46c6-9f4d-68437ea8ee0b" containerName="extract-utilities" Mar 20 11:02:05 crc kubenswrapper[4860]: E0320 11:02:05.223210 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="514f05c3-1404-46c6-9f4d-68437ea8ee0b" containerName="registry-server" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.223236 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="514f05c3-1404-46c6-9f4d-68437ea8ee0b" containerName="registry-server" Mar 20 11:02:05 crc kubenswrapper[4860]: E0320 11:02:05.223244 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f20cb95e-5480-4c9c-859f-0b03d679ab06" containerName="extract-content" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.223250 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="f20cb95e-5480-4c9c-859f-0b03d679ab06" containerName="extract-content" Mar 20 11:02:05 crc kubenswrapper[4860]: E0320 
11:02:05.223263 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f84f111-5991-4e78-9508-82283b8e36f7" containerName="registry-server" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.223269 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f84f111-5991-4e78-9508-82283b8e36f7" containerName="registry-server" Mar 20 11:02:05 crc kubenswrapper[4860]: E0320 11:02:05.223279 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="514f05c3-1404-46c6-9f4d-68437ea8ee0b" containerName="extract-content" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.223288 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="514f05c3-1404-46c6-9f4d-68437ea8ee0b" containerName="extract-content" Mar 20 11:02:05 crc kubenswrapper[4860]: E0320 11:02:05.223302 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="980f5756-4935-469d-933b-f4e339ded9a4" containerName="oc" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.223308 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="980f5756-4935-469d-933b-f4e339ded9a4" containerName="oc" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.223465 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2690d8b-c7f7-4e71-af44-33444e4d6187" containerName="registry-server" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.223480 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="980f5756-4935-469d-933b-f4e339ded9a4" containerName="oc" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.223491 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="514f05c3-1404-46c6-9f4d-68437ea8ee0b" containerName="registry-server" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.223502 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="403ca5f6-bd52-40de-88d6-5151b3202c76" containerName="marketplace-operator" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.223535 4860 
memory_manager.go:354] "RemoveStaleState removing state" podUID="4f84f111-5991-4e78-9508-82283b8e36f7" containerName="registry-server" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.223547 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="f20cb95e-5480-4c9c-859f-0b03d679ab06" containerName="registry-server" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.224242 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-zjfp8" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.242698 4860 scope.go:117] "RemoveContainer" containerID="3d9773a58e39a2f11588636465b237f3ac7e55d6ecd57abbb192e60cfa1df11f" Mar 20 11:02:05 crc kubenswrapper[4860]: E0320 11:02:05.243464 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d9773a58e39a2f11588636465b237f3ac7e55d6ecd57abbb192e60cfa1df11f\": container with ID starting with 3d9773a58e39a2f11588636465b237f3ac7e55d6ecd57abbb192e60cfa1df11f not found: ID does not exist" containerID="3d9773a58e39a2f11588636465b237f3ac7e55d6ecd57abbb192e60cfa1df11f" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.243641 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d9773a58e39a2f11588636465b237f3ac7e55d6ecd57abbb192e60cfa1df11f"} err="failed to get container status \"3d9773a58e39a2f11588636465b237f3ac7e55d6ecd57abbb192e60cfa1df11f\": rpc error: code = NotFound desc = could not find container \"3d9773a58e39a2f11588636465b237f3ac7e55d6ecd57abbb192e60cfa1df11f\": container with ID starting with 3d9773a58e39a2f11588636465b237f3ac7e55d6ecd57abbb192e60cfa1df11f not found: ID does not exist" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.243802 4860 scope.go:117] "RemoveContainer" containerID="01d41166d3fd0e46f3b970c35d6bd76db150d430d4adb17e4e8f0fd1e47c7e63" Mar 20 11:02:05 crc kubenswrapper[4860]: 
E0320 11:02:05.244246 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01d41166d3fd0e46f3b970c35d6bd76db150d430d4adb17e4e8f0fd1e47c7e63\": container with ID starting with 01d41166d3fd0e46f3b970c35d6bd76db150d430d4adb17e4e8f0fd1e47c7e63 not found: ID does not exist" containerID="01d41166d3fd0e46f3b970c35d6bd76db150d430d4adb17e4e8f0fd1e47c7e63" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.244278 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01d41166d3fd0e46f3b970c35d6bd76db150d430d4adb17e4e8f0fd1e47c7e63"} err="failed to get container status \"01d41166d3fd0e46f3b970c35d6bd76db150d430d4adb17e4e8f0fd1e47c7e63\": rpc error: code = NotFound desc = could not find container \"01d41166d3fd0e46f3b970c35d6bd76db150d430d4adb17e4e8f0fd1e47c7e63\": container with ID starting with 01d41166d3fd0e46f3b970c35d6bd76db150d430d4adb17e4e8f0fd1e47c7e63 not found: ID does not exist" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.244301 4860 scope.go:117] "RemoveContainer" containerID="3c2e183391f543c0b20ce684b31a9a8169bae71ceb6b6c5028bb8d294a9daeac" Mar 20 11:02:05 crc kubenswrapper[4860]: E0320 11:02:05.244702 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c2e183391f543c0b20ce684b31a9a8169bae71ceb6b6c5028bb8d294a9daeac\": container with ID starting with 3c2e183391f543c0b20ce684b31a9a8169bae71ceb6b6c5028bb8d294a9daeac not found: ID does not exist" containerID="3c2e183391f543c0b20ce684b31a9a8169bae71ceb6b6c5028bb8d294a9daeac" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.244749 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c2e183391f543c0b20ce684b31a9a8169bae71ceb6b6c5028bb8d294a9daeac"} err="failed to get container status \"3c2e183391f543c0b20ce684b31a9a8169bae71ceb6b6c5028bb8d294a9daeac\": 
rpc error: code = NotFound desc = could not find container \"3c2e183391f543c0b20ce684b31a9a8169bae71ceb6b6c5028bb8d294a9daeac\": container with ID starting with 3c2e183391f543c0b20ce684b31a9a8169bae71ceb6b6c5028bb8d294a9daeac not found: ID does not exist" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.244776 4860 scope.go:117] "RemoveContainer" containerID="71b2cb77686050f112cbf04ff04d1685bf9b88c82f8a98a4da293848bc821143" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.248024 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-zjfp8"] Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.250831 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/514f05c3-1404-46c6-9f4d-68437ea8ee0b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "514f05c3-1404-46c6-9f4d-68437ea8ee0b" (UID: "514f05c3-1404-46c6-9f4d-68437ea8ee0b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.281060 4860 scope.go:117] "RemoveContainer" containerID="857625adfd41b4c07f7a8c47d596e9428a9381932a0175f6ddfcb8f7de0229a9" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.288588 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-n79b7"] Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.294062 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkzrv\" (UniqueName: \"kubernetes.io/projected/09980524-2db0-4279-8e7c-09d82081be4b-kube-api-access-tkzrv\") pod \"image-registry-66df7c8f76-zjfp8\" (UID: \"09980524-2db0-4279-8e7c-09d82081be4b\") " pod="openshift-image-registry/image-registry-66df7c8f76-zjfp8" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.294127 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/09980524-2db0-4279-8e7c-09d82081be4b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-zjfp8\" (UID: \"09980524-2db0-4279-8e7c-09d82081be4b\") " pod="openshift-image-registry/image-registry-66df7c8f76-zjfp8" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.294204 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/09980524-2db0-4279-8e7c-09d82081be4b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-zjfp8\" (UID: \"09980524-2db0-4279-8e7c-09d82081be4b\") " pod="openshift-image-registry/image-registry-66df7c8f76-zjfp8" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.294256 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/09980524-2db0-4279-8e7c-09d82081be4b-bound-sa-token\") pod \"image-registry-66df7c8f76-zjfp8\" (UID: \"09980524-2db0-4279-8e7c-09d82081be4b\") " pod="openshift-image-registry/image-registry-66df7c8f76-zjfp8" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.294300 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/09980524-2db0-4279-8e7c-09d82081be4b-registry-tls\") pod \"image-registry-66df7c8f76-zjfp8\" (UID: \"09980524-2db0-4279-8e7c-09d82081be4b\") " pod="openshift-image-registry/image-registry-66df7c8f76-zjfp8" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.294325 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/09980524-2db0-4279-8e7c-09d82081be4b-registry-certificates\") pod \"image-registry-66df7c8f76-zjfp8\" (UID: \"09980524-2db0-4279-8e7c-09d82081be4b\") " pod="openshift-image-registry/image-registry-66df7c8f76-zjfp8" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.294348 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/09980524-2db0-4279-8e7c-09d82081be4b-trusted-ca\") pod \"image-registry-66df7c8f76-zjfp8\" (UID: \"09980524-2db0-4279-8e7c-09d82081be4b\") " pod="openshift-image-registry/image-registry-66df7c8f76-zjfp8" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.294386 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-zjfp8\" (UID: \"09980524-2db0-4279-8e7c-09d82081be4b\") " pod="openshift-image-registry/image-registry-66df7c8f76-zjfp8" Mar 20 11:02:05 crc 
kubenswrapper[4860]: I0320 11:02:05.294443 4860 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/514f05c3-1404-46c6-9f4d-68437ea8ee0b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.309928 4860 scope.go:117] "RemoveContainer" containerID="71b2cb77686050f112cbf04ff04d1685bf9b88c82f8a98a4da293848bc821143" Mar 20 11:02:05 crc kubenswrapper[4860]: E0320 11:02:05.311039 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71b2cb77686050f112cbf04ff04d1685bf9b88c82f8a98a4da293848bc821143\": container with ID starting with 71b2cb77686050f112cbf04ff04d1685bf9b88c82f8a98a4da293848bc821143 not found: ID does not exist" containerID="71b2cb77686050f112cbf04ff04d1685bf9b88c82f8a98a4da293848bc821143" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.311102 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71b2cb77686050f112cbf04ff04d1685bf9b88c82f8a98a4da293848bc821143"} err="failed to get container status \"71b2cb77686050f112cbf04ff04d1685bf9b88c82f8a98a4da293848bc821143\": rpc error: code = NotFound desc = could not find container \"71b2cb77686050f112cbf04ff04d1685bf9b88c82f8a98a4da293848bc821143\": container with ID starting with 71b2cb77686050f112cbf04ff04d1685bf9b88c82f8a98a4da293848bc821143 not found: ID does not exist" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.311146 4860 scope.go:117] "RemoveContainer" containerID="857625adfd41b4c07f7a8c47d596e9428a9381932a0175f6ddfcb8f7de0229a9" Mar 20 11:02:05 crc kubenswrapper[4860]: E0320 11:02:05.311707 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"857625adfd41b4c07f7a8c47d596e9428a9381932a0175f6ddfcb8f7de0229a9\": container with ID starting with 
857625adfd41b4c07f7a8c47d596e9428a9381932a0175f6ddfcb8f7de0229a9 not found: ID does not exist" containerID="857625adfd41b4c07f7a8c47d596e9428a9381932a0175f6ddfcb8f7de0229a9" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.311778 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"857625adfd41b4c07f7a8c47d596e9428a9381932a0175f6ddfcb8f7de0229a9"} err="failed to get container status \"857625adfd41b4c07f7a8c47d596e9428a9381932a0175f6ddfcb8f7de0229a9\": rpc error: code = NotFound desc = could not find container \"857625adfd41b4c07f7a8c47d596e9428a9381932a0175f6ddfcb8f7de0229a9\": container with ID starting with 857625adfd41b4c07f7a8c47d596e9428a9381932a0175f6ddfcb8f7de0229a9 not found: ID does not exist" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.311827 4860 scope.go:117] "RemoveContainer" containerID="dd62fe82165d292d4799e06ebb35b908db080de2fe5e228a5d81891d222fa958" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.314237 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-n79b7"] Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.324646 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-zjfp8\" (UID: \"09980524-2db0-4279-8e7c-09d82081be4b\") " pod="openshift-image-registry/image-registry-66df7c8f76-zjfp8" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.328259 4860 scope.go:117] "RemoveContainer" containerID="844c418dac774548dbf403e3c458e1c031e1eab485a8882313d1a36213143b22" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.374614 4860 scope.go:117] "RemoveContainer" containerID="62152cf84060d8945786af63e9ccf7c263d87bdc6a8315bb08a5eacc7087c847" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.388304 4860 
scope.go:117] "RemoveContainer" containerID="dd62fe82165d292d4799e06ebb35b908db080de2fe5e228a5d81891d222fa958" Mar 20 11:02:05 crc kubenswrapper[4860]: E0320 11:02:05.388800 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd62fe82165d292d4799e06ebb35b908db080de2fe5e228a5d81891d222fa958\": container with ID starting with dd62fe82165d292d4799e06ebb35b908db080de2fe5e228a5d81891d222fa958 not found: ID does not exist" containerID="dd62fe82165d292d4799e06ebb35b908db080de2fe5e228a5d81891d222fa958" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.388849 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd62fe82165d292d4799e06ebb35b908db080de2fe5e228a5d81891d222fa958"} err="failed to get container status \"dd62fe82165d292d4799e06ebb35b908db080de2fe5e228a5d81891d222fa958\": rpc error: code = NotFound desc = could not find container \"dd62fe82165d292d4799e06ebb35b908db080de2fe5e228a5d81891d222fa958\": container with ID starting with dd62fe82165d292d4799e06ebb35b908db080de2fe5e228a5d81891d222fa958 not found: ID does not exist" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.388881 4860 scope.go:117] "RemoveContainer" containerID="844c418dac774548dbf403e3c458e1c031e1eab485a8882313d1a36213143b22" Mar 20 11:02:05 crc kubenswrapper[4860]: E0320 11:02:05.389650 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"844c418dac774548dbf403e3c458e1c031e1eab485a8882313d1a36213143b22\": container with ID starting with 844c418dac774548dbf403e3c458e1c031e1eab485a8882313d1a36213143b22 not found: ID does not exist" containerID="844c418dac774548dbf403e3c458e1c031e1eab485a8882313d1a36213143b22" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.389687 4860 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"844c418dac774548dbf403e3c458e1c031e1eab485a8882313d1a36213143b22"} err="failed to get container status \"844c418dac774548dbf403e3c458e1c031e1eab485a8882313d1a36213143b22\": rpc error: code = NotFound desc = could not find container \"844c418dac774548dbf403e3c458e1c031e1eab485a8882313d1a36213143b22\": container with ID starting with 844c418dac774548dbf403e3c458e1c031e1eab485a8882313d1a36213143b22 not found: ID does not exist" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.389721 4860 scope.go:117] "RemoveContainer" containerID="62152cf84060d8945786af63e9ccf7c263d87bdc6a8315bb08a5eacc7087c847" Mar 20 11:02:05 crc kubenswrapper[4860]: E0320 11:02:05.390168 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62152cf84060d8945786af63e9ccf7c263d87bdc6a8315bb08a5eacc7087c847\": container with ID starting with 62152cf84060d8945786af63e9ccf7c263d87bdc6a8315bb08a5eacc7087c847 not found: ID does not exist" containerID="62152cf84060d8945786af63e9ccf7c263d87bdc6a8315bb08a5eacc7087c847" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.390237 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62152cf84060d8945786af63e9ccf7c263d87bdc6a8315bb08a5eacc7087c847"} err="failed to get container status \"62152cf84060d8945786af63e9ccf7c263d87bdc6a8315bb08a5eacc7087c847\": rpc error: code = NotFound desc = could not find container \"62152cf84060d8945786af63e9ccf7c263d87bdc6a8315bb08a5eacc7087c847\": container with ID starting with 62152cf84060d8945786af63e9ccf7c263d87bdc6a8315bb08a5eacc7087c847 not found: ID does not exist" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.390279 4860 scope.go:117] "RemoveContainer" containerID="bd0572960d7229285fe10a8cb3d58bbc9ca87bc5ab990b09db80034d14aa31ff" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.395988 4860 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/09980524-2db0-4279-8e7c-09d82081be4b-registry-certificates\") pod \"image-registry-66df7c8f76-zjfp8\" (UID: \"09980524-2db0-4279-8e7c-09d82081be4b\") " pod="openshift-image-registry/image-registry-66df7c8f76-zjfp8" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.396059 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/09980524-2db0-4279-8e7c-09d82081be4b-trusted-ca\") pod \"image-registry-66df7c8f76-zjfp8\" (UID: \"09980524-2db0-4279-8e7c-09d82081be4b\") " pod="openshift-image-registry/image-registry-66df7c8f76-zjfp8" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.396132 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkzrv\" (UniqueName: \"kubernetes.io/projected/09980524-2db0-4279-8e7c-09d82081be4b-kube-api-access-tkzrv\") pod \"image-registry-66df7c8f76-zjfp8\" (UID: \"09980524-2db0-4279-8e7c-09d82081be4b\") " pod="openshift-image-registry/image-registry-66df7c8f76-zjfp8" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.396175 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/09980524-2db0-4279-8e7c-09d82081be4b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-zjfp8\" (UID: \"09980524-2db0-4279-8e7c-09d82081be4b\") " pod="openshift-image-registry/image-registry-66df7c8f76-zjfp8" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.396247 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/09980524-2db0-4279-8e7c-09d82081be4b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-zjfp8\" (UID: \"09980524-2db0-4279-8e7c-09d82081be4b\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-zjfp8" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.396269 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/09980524-2db0-4279-8e7c-09d82081be4b-bound-sa-token\") pod \"image-registry-66df7c8f76-zjfp8\" (UID: \"09980524-2db0-4279-8e7c-09d82081be4b\") " pod="openshift-image-registry/image-registry-66df7c8f76-zjfp8" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.396302 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/09980524-2db0-4279-8e7c-09d82081be4b-registry-tls\") pod \"image-registry-66df7c8f76-zjfp8\" (UID: \"09980524-2db0-4279-8e7c-09d82081be4b\") " pod="openshift-image-registry/image-registry-66df7c8f76-zjfp8" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.396970 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/09980524-2db0-4279-8e7c-09d82081be4b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-zjfp8\" (UID: \"09980524-2db0-4279-8e7c-09d82081be4b\") " pod="openshift-image-registry/image-registry-66df7c8f76-zjfp8" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.397479 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/09980524-2db0-4279-8e7c-09d82081be4b-registry-certificates\") pod \"image-registry-66df7c8f76-zjfp8\" (UID: \"09980524-2db0-4279-8e7c-09d82081be4b\") " pod="openshift-image-registry/image-registry-66df7c8f76-zjfp8" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.398844 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/09980524-2db0-4279-8e7c-09d82081be4b-trusted-ca\") pod \"image-registry-66df7c8f76-zjfp8\" (UID: 
\"09980524-2db0-4279-8e7c-09d82081be4b\") " pod="openshift-image-registry/image-registry-66df7c8f76-zjfp8" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.402104 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/09980524-2db0-4279-8e7c-09d82081be4b-registry-tls\") pod \"image-registry-66df7c8f76-zjfp8\" (UID: \"09980524-2db0-4279-8e7c-09d82081be4b\") " pod="openshift-image-registry/image-registry-66df7c8f76-zjfp8" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.402845 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/09980524-2db0-4279-8e7c-09d82081be4b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-zjfp8\" (UID: \"09980524-2db0-4279-8e7c-09d82081be4b\") " pod="openshift-image-registry/image-registry-66df7c8f76-zjfp8" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.409208 4860 scope.go:117] "RemoveContainer" containerID="1e4d0d88e053aac4fbcbc20ef32b8805ad5fa659b78c3257d2c273ebeb4ec532" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.413581 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/09980524-2db0-4279-8e7c-09d82081be4b-bound-sa-token\") pod \"image-registry-66df7c8f76-zjfp8\" (UID: \"09980524-2db0-4279-8e7c-09d82081be4b\") " pod="openshift-image-registry/image-registry-66df7c8f76-zjfp8" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.417499 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkzrv\" (UniqueName: \"kubernetes.io/projected/09980524-2db0-4279-8e7c-09d82081be4b-kube-api-access-tkzrv\") pod \"image-registry-66df7c8f76-zjfp8\" (UID: \"09980524-2db0-4279-8e7c-09d82081be4b\") " pod="openshift-image-registry/image-registry-66df7c8f76-zjfp8" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.423510 4860 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f84f111-5991-4e78-9508-82283b8e36f7" path="/var/lib/kubelet/pods/4f84f111-5991-4e78-9508-82283b8e36f7/volumes" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.424411 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2690d8b-c7f7-4e71-af44-33444e4d6187" path="/var/lib/kubelet/pods/d2690d8b-c7f7-4e71-af44-33444e4d6187/volumes" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.448947 4860 scope.go:117] "RemoveContainer" containerID="6a6c388a79209365a4c51728af06e871b437cb5dd7b79151114287e046dc0dfe" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.466408 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zhgh4"] Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.470492 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zhgh4"] Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.497417 4860 scope.go:117] "RemoveContainer" containerID="bd0572960d7229285fe10a8cb3d58bbc9ca87bc5ab990b09db80034d14aa31ff" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.497546 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d9xlp"] Mar 20 11:02:05 crc kubenswrapper[4860]: E0320 11:02:05.501346 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd0572960d7229285fe10a8cb3d58bbc9ca87bc5ab990b09db80034d14aa31ff\": container with ID starting with bd0572960d7229285fe10a8cb3d58bbc9ca87bc5ab990b09db80034d14aa31ff not found: ID does not exist" containerID="bd0572960d7229285fe10a8cb3d58bbc9ca87bc5ab990b09db80034d14aa31ff" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.501379 4860 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"bd0572960d7229285fe10a8cb3d58bbc9ca87bc5ab990b09db80034d14aa31ff"} err="failed to get container status \"bd0572960d7229285fe10a8cb3d58bbc9ca87bc5ab990b09db80034d14aa31ff\": rpc error: code = NotFound desc = could not find container \"bd0572960d7229285fe10a8cb3d58bbc9ca87bc5ab990b09db80034d14aa31ff\": container with ID starting with bd0572960d7229285fe10a8cb3d58bbc9ca87bc5ab990b09db80034d14aa31ff not found: ID does not exist" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.501431 4860 scope.go:117] "RemoveContainer" containerID="1e4d0d88e053aac4fbcbc20ef32b8805ad5fa659b78c3257d2c273ebeb4ec532" Mar 20 11:02:05 crc kubenswrapper[4860]: E0320 11:02:05.506753 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e4d0d88e053aac4fbcbc20ef32b8805ad5fa659b78c3257d2c273ebeb4ec532\": container with ID starting with 1e4d0d88e053aac4fbcbc20ef32b8805ad5fa659b78c3257d2c273ebeb4ec532 not found: ID does not exist" containerID="1e4d0d88e053aac4fbcbc20ef32b8805ad5fa659b78c3257d2c273ebeb4ec532" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.506805 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e4d0d88e053aac4fbcbc20ef32b8805ad5fa659b78c3257d2c273ebeb4ec532"} err="failed to get container status \"1e4d0d88e053aac4fbcbc20ef32b8805ad5fa659b78c3257d2c273ebeb4ec532\": rpc error: code = NotFound desc = could not find container \"1e4d0d88e053aac4fbcbc20ef32b8805ad5fa659b78c3257d2c273ebeb4ec532\": container with ID starting with 1e4d0d88e053aac4fbcbc20ef32b8805ad5fa659b78c3257d2c273ebeb4ec532 not found: ID does not exist" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.506854 4860 scope.go:117] "RemoveContainer" containerID="6a6c388a79209365a4c51728af06e871b437cb5dd7b79151114287e046dc0dfe" Mar 20 11:02:05 crc kubenswrapper[4860]: E0320 11:02:05.507822 4860 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"6a6c388a79209365a4c51728af06e871b437cb5dd7b79151114287e046dc0dfe\": container with ID starting with 6a6c388a79209365a4c51728af06e871b437cb5dd7b79151114287e046dc0dfe not found: ID does not exist" containerID="6a6c388a79209365a4c51728af06e871b437cb5dd7b79151114287e046dc0dfe" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.507886 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a6c388a79209365a4c51728af06e871b437cb5dd7b79151114287e046dc0dfe"} err="failed to get container status \"6a6c388a79209365a4c51728af06e871b437cb5dd7b79151114287e046dc0dfe\": rpc error: code = NotFound desc = could not find container \"6a6c388a79209365a4c51728af06e871b437cb5dd7b79151114287e046dc0dfe\": container with ID starting with 6a6c388a79209365a4c51728af06e871b437cb5dd7b79151114287e046dc0dfe not found: ID does not exist" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.507934 4860 scope.go:117] "RemoveContainer" containerID="8e14e348104a685ae72ae8d4fc93750bfae41d97762c8a26ba93a04b1dc8a3d4" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.510460 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-d9xlp"] Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.514320 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qq8bh"] Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.517125 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qq8bh"] Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.565846 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-zjfp8" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.566864 4860 scope.go:117] "RemoveContainer" containerID="dca022e41de9a1a813fac9c51ffa02dd710183b0522c00eebac503fc8a8a606a" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.597981 4860 scope.go:117] "RemoveContainer" containerID="fc706b2c173f49d2a44bb4c1c738033acd0d9e47df9115abcd3736cf3695dc3d" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.618898 4860 scope.go:117] "RemoveContainer" containerID="8e14e348104a685ae72ae8d4fc93750bfae41d97762c8a26ba93a04b1dc8a3d4" Mar 20 11:02:05 crc kubenswrapper[4860]: E0320 11:02:05.619845 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e14e348104a685ae72ae8d4fc93750bfae41d97762c8a26ba93a04b1dc8a3d4\": container with ID starting with 8e14e348104a685ae72ae8d4fc93750bfae41d97762c8a26ba93a04b1dc8a3d4 not found: ID does not exist" containerID="8e14e348104a685ae72ae8d4fc93750bfae41d97762c8a26ba93a04b1dc8a3d4" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.619885 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e14e348104a685ae72ae8d4fc93750bfae41d97762c8a26ba93a04b1dc8a3d4"} err="failed to get container status \"8e14e348104a685ae72ae8d4fc93750bfae41d97762c8a26ba93a04b1dc8a3d4\": rpc error: code = NotFound desc = could not find container \"8e14e348104a685ae72ae8d4fc93750bfae41d97762c8a26ba93a04b1dc8a3d4\": container with ID starting with 8e14e348104a685ae72ae8d4fc93750bfae41d97762c8a26ba93a04b1dc8a3d4 not found: ID does not exist" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.619923 4860 scope.go:117] "RemoveContainer" containerID="dca022e41de9a1a813fac9c51ffa02dd710183b0522c00eebac503fc8a8a606a" Mar 20 11:02:05 crc kubenswrapper[4860]: E0320 11:02:05.620422 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"dca022e41de9a1a813fac9c51ffa02dd710183b0522c00eebac503fc8a8a606a\": container with ID starting with dca022e41de9a1a813fac9c51ffa02dd710183b0522c00eebac503fc8a8a606a not found: ID does not exist" containerID="dca022e41de9a1a813fac9c51ffa02dd710183b0522c00eebac503fc8a8a606a" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.620456 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dca022e41de9a1a813fac9c51ffa02dd710183b0522c00eebac503fc8a8a606a"} err="failed to get container status \"dca022e41de9a1a813fac9c51ffa02dd710183b0522c00eebac503fc8a8a606a\": rpc error: code = NotFound desc = could not find container \"dca022e41de9a1a813fac9c51ffa02dd710183b0522c00eebac503fc8a8a606a\": container with ID starting with dca022e41de9a1a813fac9c51ffa02dd710183b0522c00eebac503fc8a8a606a not found: ID does not exist" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.620506 4860 scope.go:117] "RemoveContainer" containerID="fc706b2c173f49d2a44bb4c1c738033acd0d9e47df9115abcd3736cf3695dc3d" Mar 20 11:02:05 crc kubenswrapper[4860]: E0320 11:02:05.620794 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc706b2c173f49d2a44bb4c1c738033acd0d9e47df9115abcd3736cf3695dc3d\": container with ID starting with fc706b2c173f49d2a44bb4c1c738033acd0d9e47df9115abcd3736cf3695dc3d not found: ID does not exist" containerID="fc706b2c173f49d2a44bb4c1c738033acd0d9e47df9115abcd3736cf3695dc3d" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.620824 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc706b2c173f49d2a44bb4c1c738033acd0d9e47df9115abcd3736cf3695dc3d"} err="failed to get container status \"fc706b2c173f49d2a44bb4c1c738033acd0d9e47df9115abcd3736cf3695dc3d\": rpc error: code = NotFound desc = could not find container 
\"fc706b2c173f49d2a44bb4c1c738033acd0d9e47df9115abcd3736cf3695dc3d\": container with ID starting with fc706b2c173f49d2a44bb4c1c738033acd0d9e47df9115abcd3736cf3695dc3d not found: ID does not exist" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.796124 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-zjfp8"] Mar 20 11:02:06 crc kubenswrapper[4860]: I0320 11:02:06.154885 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-zjfp8" event={"ID":"09980524-2db0-4279-8e7c-09d82081be4b","Type":"ContainerStarted","Data":"bf928a9e95ea76a816887b1b1988b6034b7e02a96cd7b5632d3929848377094e"} Mar 20 11:02:06 crc kubenswrapper[4860]: I0320 11:02:06.158837 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qkfjv" event={"ID":"489f9463-a47c-4635-aad3-866e47a2c97f","Type":"ContainerStarted","Data":"d7908b32dfb6ed12940916e4e1523db0b8b1a86b415031e69430c3cc2102f94f"} Mar 20 11:02:06 crc kubenswrapper[4860]: I0320 11:02:06.158890 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qkfjv" event={"ID":"489f9463-a47c-4635-aad3-866e47a2c97f","Type":"ContainerStarted","Data":"a32770f37d964985947e6b18171cb2879bd6d548a75a3c5df5ff64cac3f0250e"} Mar 20 11:02:06 crc kubenswrapper[4860]: I0320 11:02:06.159511 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-qkfjv" Mar 20 11:02:06 crc kubenswrapper[4860]: I0320 11:02:06.166804 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-qkfjv" Mar 20 11:02:06 crc kubenswrapper[4860]: I0320 11:02:06.178680 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-qkfjv" 
podStartSLOduration=2.178650474 podStartE2EDuration="2.178650474s" podCreationTimestamp="2026-03-20 11:02:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 11:02:06.17741811 +0000 UTC m=+450.398779018" watchObservedRunningTime="2026-03-20 11:02:06.178650474 +0000 UTC m=+450.400011372" Mar 20 11:02:06 crc kubenswrapper[4860]: I0320 11:02:06.974597 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zrj5v"] Mar 20 11:02:06 crc kubenswrapper[4860]: I0320 11:02:06.974982 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="403ca5f6-bd52-40de-88d6-5151b3202c76" containerName="marketplace-operator" Mar 20 11:02:06 crc kubenswrapper[4860]: I0320 11:02:06.975896 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zrj5v" Mar 20 11:02:06 crc kubenswrapper[4860]: I0320 11:02:06.978445 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 20 11:02:06 crc kubenswrapper[4860]: I0320 11:02:06.988404 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zrj5v"] Mar 20 11:02:07 crc kubenswrapper[4860]: I0320 11:02:07.020760 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hqhc\" (UniqueName: \"kubernetes.io/projected/8d34a762-55ad-41cb-994e-d4707bfebe22-kube-api-access-4hqhc\") pod \"redhat-operators-zrj5v\" (UID: \"8d34a762-55ad-41cb-994e-d4707bfebe22\") " pod="openshift-marketplace/redhat-operators-zrj5v" Mar 20 11:02:07 crc kubenswrapper[4860]: I0320 11:02:07.021014 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d34a762-55ad-41cb-994e-d4707bfebe22-utilities\") 
pod \"redhat-operators-zrj5v\" (UID: \"8d34a762-55ad-41cb-994e-d4707bfebe22\") " pod="openshift-marketplace/redhat-operators-zrj5v" Mar 20 11:02:07 crc kubenswrapper[4860]: I0320 11:02:07.021198 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d34a762-55ad-41cb-994e-d4707bfebe22-catalog-content\") pod \"redhat-operators-zrj5v\" (UID: \"8d34a762-55ad-41cb-994e-d4707bfebe22\") " pod="openshift-marketplace/redhat-operators-zrj5v" Mar 20 11:02:07 crc kubenswrapper[4860]: I0320 11:02:07.123582 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d34a762-55ad-41cb-994e-d4707bfebe22-catalog-content\") pod \"redhat-operators-zrj5v\" (UID: \"8d34a762-55ad-41cb-994e-d4707bfebe22\") " pod="openshift-marketplace/redhat-operators-zrj5v" Mar 20 11:02:07 crc kubenswrapper[4860]: I0320 11:02:07.123934 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hqhc\" (UniqueName: \"kubernetes.io/projected/8d34a762-55ad-41cb-994e-d4707bfebe22-kube-api-access-4hqhc\") pod \"redhat-operators-zrj5v\" (UID: \"8d34a762-55ad-41cb-994e-d4707bfebe22\") " pod="openshift-marketplace/redhat-operators-zrj5v" Mar 20 11:02:07 crc kubenswrapper[4860]: I0320 11:02:07.123997 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d34a762-55ad-41cb-994e-d4707bfebe22-utilities\") pod \"redhat-operators-zrj5v\" (UID: \"8d34a762-55ad-41cb-994e-d4707bfebe22\") " pod="openshift-marketplace/redhat-operators-zrj5v" Mar 20 11:02:07 crc kubenswrapper[4860]: I0320 11:02:07.124944 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d34a762-55ad-41cb-994e-d4707bfebe22-utilities\") pod \"redhat-operators-zrj5v\" (UID: 
\"8d34a762-55ad-41cb-994e-d4707bfebe22\") " pod="openshift-marketplace/redhat-operators-zrj5v" Mar 20 11:02:07 crc kubenswrapper[4860]: I0320 11:02:07.125662 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d34a762-55ad-41cb-994e-d4707bfebe22-catalog-content\") pod \"redhat-operators-zrj5v\" (UID: \"8d34a762-55ad-41cb-994e-d4707bfebe22\") " pod="openshift-marketplace/redhat-operators-zrj5v" Mar 20 11:02:07 crc kubenswrapper[4860]: I0320 11:02:07.148675 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hqhc\" (UniqueName: \"kubernetes.io/projected/8d34a762-55ad-41cb-994e-d4707bfebe22-kube-api-access-4hqhc\") pod \"redhat-operators-zrj5v\" (UID: \"8d34a762-55ad-41cb-994e-d4707bfebe22\") " pod="openshift-marketplace/redhat-operators-zrj5v" Mar 20 11:02:07 crc kubenswrapper[4860]: I0320 11:02:07.172642 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-zjfp8" event={"ID":"09980524-2db0-4279-8e7c-09d82081be4b","Type":"ContainerStarted","Data":"7645810f63db6d98f89257c6ff5de9d6f512d0b45162b05f786543ac22180372"} Mar 20 11:02:07 crc kubenswrapper[4860]: I0320 11:02:07.172917 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-zjfp8" Mar 20 11:02:07 crc kubenswrapper[4860]: I0320 11:02:07.204210 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-zjfp8" podStartSLOduration=2.204172965 podStartE2EDuration="2.204172965s" podCreationTimestamp="2026-03-20 11:02:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 11:02:07.194909628 +0000 UTC m=+451.416270526" watchObservedRunningTime="2026-03-20 11:02:07.204172965 +0000 UTC m=+451.425533863" Mar 20 
11:02:07 crc kubenswrapper[4860]: I0320 11:02:07.292115 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zrj5v" Mar 20 11:02:07 crc kubenswrapper[4860]: I0320 11:02:07.427034 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="403ca5f6-bd52-40de-88d6-5151b3202c76" path="/var/lib/kubelet/pods/403ca5f6-bd52-40de-88d6-5151b3202c76/volumes" Mar 20 11:02:07 crc kubenswrapper[4860]: I0320 11:02:07.427679 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="514f05c3-1404-46c6-9f4d-68437ea8ee0b" path="/var/lib/kubelet/pods/514f05c3-1404-46c6-9f4d-68437ea8ee0b/volumes" Mar 20 11:02:07 crc kubenswrapper[4860]: I0320 11:02:07.428389 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f20cb95e-5480-4c9c-859f-0b03d679ab06" path="/var/lib/kubelet/pods/f20cb95e-5480-4c9c-859f-0b03d679ab06/volumes" Mar 20 11:02:07 crc kubenswrapper[4860]: I0320 11:02:07.517343 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zrj5v"] Mar 20 11:02:07 crc kubenswrapper[4860]: W0320 11:02:07.531735 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d34a762_55ad_41cb_994e_d4707bfebe22.slice/crio-9ed745147141abe4b27dbd58cfbd4de2fa83854d7c504a6c3c963c48853161c3 WatchSource:0}: Error finding container 9ed745147141abe4b27dbd58cfbd4de2fa83854d7c504a6c3c963c48853161c3: Status 404 returned error can't find the container with id 9ed745147141abe4b27dbd58cfbd4de2fa83854d7c504a6c3c963c48853161c3 Mar 20 11:02:07 crc kubenswrapper[4860]: I0320 11:02:07.984779 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-s2x6p"] Mar 20 11:02:07 crc kubenswrapper[4860]: I0320 11:02:07.986713 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-s2x6p" Mar 20 11:02:07 crc kubenswrapper[4860]: I0320 11:02:07.989203 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 20 11:02:08 crc kubenswrapper[4860]: I0320 11:02:08.044121 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da2b2cab-e4d8-48ed-b198-7aff45927348-catalog-content\") pod \"community-operators-s2x6p\" (UID: \"da2b2cab-e4d8-48ed-b198-7aff45927348\") " pod="openshift-marketplace/community-operators-s2x6p" Mar 20 11:02:08 crc kubenswrapper[4860]: I0320 11:02:08.044250 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da2b2cab-e4d8-48ed-b198-7aff45927348-utilities\") pod \"community-operators-s2x6p\" (UID: \"da2b2cab-e4d8-48ed-b198-7aff45927348\") " pod="openshift-marketplace/community-operators-s2x6p" Mar 20 11:02:08 crc kubenswrapper[4860]: I0320 11:02:08.044296 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-267pr\" (UniqueName: \"kubernetes.io/projected/da2b2cab-e4d8-48ed-b198-7aff45927348-kube-api-access-267pr\") pod \"community-operators-s2x6p\" (UID: \"da2b2cab-e4d8-48ed-b198-7aff45927348\") " pod="openshift-marketplace/community-operators-s2x6p" Mar 20 11:02:08 crc kubenswrapper[4860]: I0320 11:02:08.045946 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s2x6p"] Mar 20 11:02:08 crc kubenswrapper[4860]: I0320 11:02:08.145629 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da2b2cab-e4d8-48ed-b198-7aff45927348-utilities\") pod \"community-operators-s2x6p\" (UID: 
\"da2b2cab-e4d8-48ed-b198-7aff45927348\") " pod="openshift-marketplace/community-operators-s2x6p" Mar 20 11:02:08 crc kubenswrapper[4860]: I0320 11:02:08.145725 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-267pr\" (UniqueName: \"kubernetes.io/projected/da2b2cab-e4d8-48ed-b198-7aff45927348-kube-api-access-267pr\") pod \"community-operators-s2x6p\" (UID: \"da2b2cab-e4d8-48ed-b198-7aff45927348\") " pod="openshift-marketplace/community-operators-s2x6p" Mar 20 11:02:08 crc kubenswrapper[4860]: I0320 11:02:08.145804 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da2b2cab-e4d8-48ed-b198-7aff45927348-catalog-content\") pod \"community-operators-s2x6p\" (UID: \"da2b2cab-e4d8-48ed-b198-7aff45927348\") " pod="openshift-marketplace/community-operators-s2x6p" Mar 20 11:02:08 crc kubenswrapper[4860]: I0320 11:02:08.146526 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da2b2cab-e4d8-48ed-b198-7aff45927348-catalog-content\") pod \"community-operators-s2x6p\" (UID: \"da2b2cab-e4d8-48ed-b198-7aff45927348\") " pod="openshift-marketplace/community-operators-s2x6p" Mar 20 11:02:08 crc kubenswrapper[4860]: I0320 11:02:08.146655 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da2b2cab-e4d8-48ed-b198-7aff45927348-utilities\") pod \"community-operators-s2x6p\" (UID: \"da2b2cab-e4d8-48ed-b198-7aff45927348\") " pod="openshift-marketplace/community-operators-s2x6p" Mar 20 11:02:08 crc kubenswrapper[4860]: I0320 11:02:08.167620 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-267pr\" (UniqueName: \"kubernetes.io/projected/da2b2cab-e4d8-48ed-b198-7aff45927348-kube-api-access-267pr\") pod \"community-operators-s2x6p\" (UID: 
\"da2b2cab-e4d8-48ed-b198-7aff45927348\") " pod="openshift-marketplace/community-operators-s2x6p" Mar 20 11:02:08 crc kubenswrapper[4860]: I0320 11:02:08.182951 4860 generic.go:334] "Generic (PLEG): container finished" podID="8d34a762-55ad-41cb-994e-d4707bfebe22" containerID="bded5492155070d57a139dff89185221af2941a144e60a6b22a8e6eae05f55ca" exitCode=0 Mar 20 11:02:08 crc kubenswrapper[4860]: I0320 11:02:08.183036 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zrj5v" event={"ID":"8d34a762-55ad-41cb-994e-d4707bfebe22","Type":"ContainerDied","Data":"bded5492155070d57a139dff89185221af2941a144e60a6b22a8e6eae05f55ca"} Mar 20 11:02:08 crc kubenswrapper[4860]: I0320 11:02:08.183104 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zrj5v" event={"ID":"8d34a762-55ad-41cb-994e-d4707bfebe22","Type":"ContainerStarted","Data":"9ed745147141abe4b27dbd58cfbd4de2fa83854d7c504a6c3c963c48853161c3"} Mar 20 11:02:08 crc kubenswrapper[4860]: I0320 11:02:08.302599 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-s2x6p"
Mar 20 11:02:08 crc kubenswrapper[4860]: I0320 11:02:08.536602 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s2x6p"]
Mar 20 11:02:09 crc kubenswrapper[4860]: I0320 11:02:09.191309 4860 generic.go:334] "Generic (PLEG): container finished" podID="da2b2cab-e4d8-48ed-b198-7aff45927348" containerID="a4f90ca93d3e43497e705c4521beb4348408ab8d69ef5b2bcd7028aec3d686d5" exitCode=0
Mar 20 11:02:09 crc kubenswrapper[4860]: I0320 11:02:09.191455 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s2x6p" event={"ID":"da2b2cab-e4d8-48ed-b198-7aff45927348","Type":"ContainerDied","Data":"a4f90ca93d3e43497e705c4521beb4348408ab8d69ef5b2bcd7028aec3d686d5"}
Mar 20 11:02:09 crc kubenswrapper[4860]: I0320 11:02:09.191974 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s2x6p" event={"ID":"da2b2cab-e4d8-48ed-b198-7aff45927348","Type":"ContainerStarted","Data":"0985eda395c30cb4fc11c5a030b8aabf733cd8d60366ed8ecb07d45313940c24"}
Mar 20 11:02:09 crc kubenswrapper[4860]: I0320 11:02:09.382146 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-47qz8"]
Mar 20 11:02:09 crc kubenswrapper[4860]: I0320 11:02:09.387307 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-47qz8"
Mar 20 11:02:09 crc kubenswrapper[4860]: I0320 11:02:09.398389 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Mar 20 11:02:09 crc kubenswrapper[4860]: I0320 11:02:09.408356 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-47qz8"]
Mar 20 11:02:09 crc kubenswrapper[4860]: I0320 11:02:09.466745 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bt5v\" (UniqueName: \"kubernetes.io/projected/a3a77828-39d7-4547-ba09-26a9a0fb8e7b-kube-api-access-2bt5v\") pod \"certified-operators-47qz8\" (UID: \"a3a77828-39d7-4547-ba09-26a9a0fb8e7b\") " pod="openshift-marketplace/certified-operators-47qz8"
Mar 20 11:02:09 crc kubenswrapper[4860]: I0320 11:02:09.466795 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3a77828-39d7-4547-ba09-26a9a0fb8e7b-utilities\") pod \"certified-operators-47qz8\" (UID: \"a3a77828-39d7-4547-ba09-26a9a0fb8e7b\") " pod="openshift-marketplace/certified-operators-47qz8"
Mar 20 11:02:09 crc kubenswrapper[4860]: I0320 11:02:09.466852 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3a77828-39d7-4547-ba09-26a9a0fb8e7b-catalog-content\") pod \"certified-operators-47qz8\" (UID: \"a3a77828-39d7-4547-ba09-26a9a0fb8e7b\") " pod="openshift-marketplace/certified-operators-47qz8"
Mar 20 11:02:09 crc kubenswrapper[4860]: I0320 11:02:09.568172 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bt5v\" (UniqueName: \"kubernetes.io/projected/a3a77828-39d7-4547-ba09-26a9a0fb8e7b-kube-api-access-2bt5v\") pod \"certified-operators-47qz8\" (UID: \"a3a77828-39d7-4547-ba09-26a9a0fb8e7b\") " pod="openshift-marketplace/certified-operators-47qz8"
Mar 20 11:02:09 crc kubenswrapper[4860]: I0320 11:02:09.568267 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3a77828-39d7-4547-ba09-26a9a0fb8e7b-utilities\") pod \"certified-operators-47qz8\" (UID: \"a3a77828-39d7-4547-ba09-26a9a0fb8e7b\") " pod="openshift-marketplace/certified-operators-47qz8"
Mar 20 11:02:09 crc kubenswrapper[4860]: I0320 11:02:09.568529 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3a77828-39d7-4547-ba09-26a9a0fb8e7b-catalog-content\") pod \"certified-operators-47qz8\" (UID: \"a3a77828-39d7-4547-ba09-26a9a0fb8e7b\") " pod="openshift-marketplace/certified-operators-47qz8"
Mar 20 11:02:09 crc kubenswrapper[4860]: I0320 11:02:09.568945 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3a77828-39d7-4547-ba09-26a9a0fb8e7b-utilities\") pod \"certified-operators-47qz8\" (UID: \"a3a77828-39d7-4547-ba09-26a9a0fb8e7b\") " pod="openshift-marketplace/certified-operators-47qz8"
Mar 20 11:02:09 crc kubenswrapper[4860]: I0320 11:02:09.569183 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3a77828-39d7-4547-ba09-26a9a0fb8e7b-catalog-content\") pod \"certified-operators-47qz8\" (UID: \"a3a77828-39d7-4547-ba09-26a9a0fb8e7b\") " pod="openshift-marketplace/certified-operators-47qz8"
Mar 20 11:02:09 crc kubenswrapper[4860]: I0320 11:02:09.588068 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bt5v\" (UniqueName: \"kubernetes.io/projected/a3a77828-39d7-4547-ba09-26a9a0fb8e7b-kube-api-access-2bt5v\") pod \"certified-operators-47qz8\" (UID: \"a3a77828-39d7-4547-ba09-26a9a0fb8e7b\") " pod="openshift-marketplace/certified-operators-47qz8"
Mar 20 11:02:09 crc kubenswrapper[4860]: I0320 11:02:09.723924 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-47qz8"
Mar 20 11:02:09 crc kubenswrapper[4860]: I0320 11:02:09.980675 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-47qz8"]
Mar 20 11:02:10 crc kubenswrapper[4860]: I0320 11:02:10.200843 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s2x6p" event={"ID":"da2b2cab-e4d8-48ed-b198-7aff45927348","Type":"ContainerStarted","Data":"3692f916ebd9e82f76728a61b0840c3354adb6672a36ba82bf89b317d7536cc6"}
Mar 20 11:02:10 crc kubenswrapper[4860]: I0320 11:02:10.203465 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zrj5v" event={"ID":"8d34a762-55ad-41cb-994e-d4707bfebe22","Type":"ContainerStarted","Data":"baadb0aebaf718ba429229f6967e10ea610d78bad4be71114d5b27b336f733eb"}
Mar 20 11:02:10 crc kubenswrapper[4860]: I0320 11:02:10.205185 4860 generic.go:334] "Generic (PLEG): container finished" podID="a3a77828-39d7-4547-ba09-26a9a0fb8e7b" containerID="acb2012425b8eb6b69ac49a4cfcbb9f33c79aef77a6ddd96d2110907e1765fb8" exitCode=0
Mar 20 11:02:10 crc kubenswrapper[4860]: I0320 11:02:10.205254 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-47qz8" event={"ID":"a3a77828-39d7-4547-ba09-26a9a0fb8e7b","Type":"ContainerDied","Data":"acb2012425b8eb6b69ac49a4cfcbb9f33c79aef77a6ddd96d2110907e1765fb8"}
Mar 20 11:02:10 crc kubenswrapper[4860]: I0320 11:02:10.205276 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-47qz8" event={"ID":"a3a77828-39d7-4547-ba09-26a9a0fb8e7b","Type":"ContainerStarted","Data":"c72cf96052172bb404f7f16a65c7ab617ea74b3d559d23fc67c862ea51f205b2"}
Mar 20 11:02:10 crc kubenswrapper[4860]: I0320 11:02:10.377317 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mhds4"]
Mar 20 11:02:10 crc kubenswrapper[4860]: I0320 11:02:10.378524 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mhds4"
Mar 20 11:02:10 crc kubenswrapper[4860]: I0320 11:02:10.382062 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Mar 20 11:02:10 crc kubenswrapper[4860]: I0320 11:02:10.395351 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mhds4"]
Mar 20 11:02:10 crc kubenswrapper[4860]: I0320 11:02:10.484362 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f820689f-28ee-4cbe-bf7b-049d9ec6ef64-utilities\") pod \"redhat-marketplace-mhds4\" (UID: \"f820689f-28ee-4cbe-bf7b-049d9ec6ef64\") " pod="openshift-marketplace/redhat-marketplace-mhds4"
Mar 20 11:02:10 crc kubenswrapper[4860]: I0320 11:02:10.484457 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f820689f-28ee-4cbe-bf7b-049d9ec6ef64-catalog-content\") pod \"redhat-marketplace-mhds4\" (UID: \"f820689f-28ee-4cbe-bf7b-049d9ec6ef64\") " pod="openshift-marketplace/redhat-marketplace-mhds4"
Mar 20 11:02:10 crc kubenswrapper[4860]: I0320 11:02:10.484537 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtxjr\" (UniqueName: \"kubernetes.io/projected/f820689f-28ee-4cbe-bf7b-049d9ec6ef64-kube-api-access-dtxjr\") pod \"redhat-marketplace-mhds4\" (UID: \"f820689f-28ee-4cbe-bf7b-049d9ec6ef64\") " pod="openshift-marketplace/redhat-marketplace-mhds4"
Mar 20 11:02:10 crc kubenswrapper[4860]: I0320 11:02:10.585724 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f820689f-28ee-4cbe-bf7b-049d9ec6ef64-utilities\") pod \"redhat-marketplace-mhds4\" (UID: \"f820689f-28ee-4cbe-bf7b-049d9ec6ef64\") " pod="openshift-marketplace/redhat-marketplace-mhds4"
Mar 20 11:02:10 crc kubenswrapper[4860]: I0320 11:02:10.585791 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f820689f-28ee-4cbe-bf7b-049d9ec6ef64-catalog-content\") pod \"redhat-marketplace-mhds4\" (UID: \"f820689f-28ee-4cbe-bf7b-049d9ec6ef64\") " pod="openshift-marketplace/redhat-marketplace-mhds4"
Mar 20 11:02:10 crc kubenswrapper[4860]: I0320 11:02:10.585817 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtxjr\" (UniqueName: \"kubernetes.io/projected/f820689f-28ee-4cbe-bf7b-049d9ec6ef64-kube-api-access-dtxjr\") pod \"redhat-marketplace-mhds4\" (UID: \"f820689f-28ee-4cbe-bf7b-049d9ec6ef64\") " pod="openshift-marketplace/redhat-marketplace-mhds4"
Mar 20 11:02:10 crc kubenswrapper[4860]: I0320 11:02:10.586551 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f820689f-28ee-4cbe-bf7b-049d9ec6ef64-catalog-content\") pod \"redhat-marketplace-mhds4\" (UID: \"f820689f-28ee-4cbe-bf7b-049d9ec6ef64\") " pod="openshift-marketplace/redhat-marketplace-mhds4"
Mar 20 11:02:10 crc kubenswrapper[4860]: I0320 11:02:10.586865 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f820689f-28ee-4cbe-bf7b-049d9ec6ef64-utilities\") pod \"redhat-marketplace-mhds4\" (UID: \"f820689f-28ee-4cbe-bf7b-049d9ec6ef64\") " pod="openshift-marketplace/redhat-marketplace-mhds4"
Mar 20 11:02:10 crc kubenswrapper[4860]: I0320 11:02:10.606537 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtxjr\" (UniqueName: \"kubernetes.io/projected/f820689f-28ee-4cbe-bf7b-049d9ec6ef64-kube-api-access-dtxjr\") pod \"redhat-marketplace-mhds4\" (UID: \"f820689f-28ee-4cbe-bf7b-049d9ec6ef64\") " pod="openshift-marketplace/redhat-marketplace-mhds4"
Mar 20 11:02:10 crc kubenswrapper[4860]: I0320 11:02:10.693699 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mhds4"
Mar 20 11:02:10 crc kubenswrapper[4860]: I0320 11:02:10.902895 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mhds4"]
Mar 20 11:02:11 crc kubenswrapper[4860]: I0320 11:02:11.216436 4860 generic.go:334] "Generic (PLEG): container finished" podID="8d34a762-55ad-41cb-994e-d4707bfebe22" containerID="baadb0aebaf718ba429229f6967e10ea610d78bad4be71114d5b27b336f733eb" exitCode=0
Mar 20 11:02:11 crc kubenswrapper[4860]: I0320 11:02:11.216563 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zrj5v" event={"ID":"8d34a762-55ad-41cb-994e-d4707bfebe22","Type":"ContainerDied","Data":"baadb0aebaf718ba429229f6967e10ea610d78bad4be71114d5b27b336f733eb"}
Mar 20 11:02:11 crc kubenswrapper[4860]: I0320 11:02:11.220690 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-47qz8" event={"ID":"a3a77828-39d7-4547-ba09-26a9a0fb8e7b","Type":"ContainerStarted","Data":"cbb4bb657f80c7b555a0587373c2ce4e2b3f566ea4f63e69e63e02e2a6b2ec7a"}
Mar 20 11:02:11 crc kubenswrapper[4860]: I0320 11:02:11.233208 4860 generic.go:334] "Generic (PLEG): container finished" podID="da2b2cab-e4d8-48ed-b198-7aff45927348" containerID="3692f916ebd9e82f76728a61b0840c3354adb6672a36ba82bf89b317d7536cc6" exitCode=0
Mar 20 11:02:11 crc kubenswrapper[4860]: I0320 11:02:11.233329 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s2x6p" event={"ID":"da2b2cab-e4d8-48ed-b198-7aff45927348","Type":"ContainerDied","Data":"3692f916ebd9e82f76728a61b0840c3354adb6672a36ba82bf89b317d7536cc6"}
Mar 20 11:02:11 crc kubenswrapper[4860]: I0320 11:02:11.247359 4860 generic.go:334] "Generic (PLEG): container finished" podID="f820689f-28ee-4cbe-bf7b-049d9ec6ef64" containerID="825da8bd923b4ec65455cb3fc9a9543a1bb3dde599c10fcad2d8d0915679c73b" exitCode=0
Mar 20 11:02:11 crc kubenswrapper[4860]: I0320 11:02:11.247414 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mhds4" event={"ID":"f820689f-28ee-4cbe-bf7b-049d9ec6ef64","Type":"ContainerDied","Data":"825da8bd923b4ec65455cb3fc9a9543a1bb3dde599c10fcad2d8d0915679c73b"}
Mar 20 11:02:11 crc kubenswrapper[4860]: I0320 11:02:11.247454 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mhds4" event={"ID":"f820689f-28ee-4cbe-bf7b-049d9ec6ef64","Type":"ContainerStarted","Data":"3e386c36a91903733f639f25b71736ac81412f01a413b43f4de17bbfd5d41ebb"}
Mar 20 11:02:12 crc kubenswrapper[4860]: I0320 11:02:12.257769 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zrj5v" event={"ID":"8d34a762-55ad-41cb-994e-d4707bfebe22","Type":"ContainerStarted","Data":"1441bba4ff54877f3670f5cf889c2f3f29ba6e9d0688c3c6feb2a2b3d60283bf"}
Mar 20 11:02:12 crc kubenswrapper[4860]: I0320 11:02:12.259599 4860 generic.go:334] "Generic (PLEG): container finished" podID="a3a77828-39d7-4547-ba09-26a9a0fb8e7b" containerID="cbb4bb657f80c7b555a0587373c2ce4e2b3f566ea4f63e69e63e02e2a6b2ec7a" exitCode=0
Mar 20 11:02:12 crc kubenswrapper[4860]: I0320 11:02:12.259647 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-47qz8" event={"ID":"a3a77828-39d7-4547-ba09-26a9a0fb8e7b","Type":"ContainerDied","Data":"cbb4bb657f80c7b555a0587373c2ce4e2b3f566ea4f63e69e63e02e2a6b2ec7a"}
Mar 20 11:02:12 crc kubenswrapper[4860]: I0320 11:02:12.270805 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s2x6p" event={"ID":"da2b2cab-e4d8-48ed-b198-7aff45927348","Type":"ContainerStarted","Data":"c529e06c44dcbe461813fdd459369a14b6b219590561c7a4393e8a5bf49d0970"}
Mar 20 11:02:12 crc kubenswrapper[4860]: I0320 11:02:12.274240 4860 generic.go:334] "Generic (PLEG): container finished" podID="f820689f-28ee-4cbe-bf7b-049d9ec6ef64" containerID="d9b383675be39b9fd832a1c5911831cc8302efe7e48e72c595029c9c5dc8b325" exitCode=0
Mar 20 11:02:12 crc kubenswrapper[4860]: I0320 11:02:12.274315 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mhds4" event={"ID":"f820689f-28ee-4cbe-bf7b-049d9ec6ef64","Type":"ContainerDied","Data":"d9b383675be39b9fd832a1c5911831cc8302efe7e48e72c595029c9c5dc8b325"}
Mar 20 11:02:12 crc kubenswrapper[4860]: I0320 11:02:12.280523 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zrj5v" podStartSLOduration=2.727340365 podStartE2EDuration="6.280499513s" podCreationTimestamp="2026-03-20 11:02:06 +0000 UTC" firstStartedPulling="2026-03-20 11:02:08.18630992 +0000 UTC m=+452.407670838" lastFinishedPulling="2026-03-20 11:02:11.739469088 +0000 UTC m=+455.960829986" observedRunningTime="2026-03-20 11:02:12.276565995 +0000 UTC m=+456.497926893" watchObservedRunningTime="2026-03-20 11:02:12.280499513 +0000 UTC m=+456.501860411"
Mar 20 11:02:12 crc kubenswrapper[4860]: I0320 11:02:12.327757 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-s2x6p" podStartSLOduration=2.586557625 podStartE2EDuration="5.327735343s" podCreationTimestamp="2026-03-20 11:02:07 +0000 UTC" firstStartedPulling="2026-03-20 11:02:09.264286844 +0000 UTC m=+453.485647742" lastFinishedPulling="2026-03-20 11:02:12.005464562 +0000 UTC m=+456.226825460" observedRunningTime="2026-03-20 11:02:12.326652973 +0000 UTC m=+456.548013881" watchObservedRunningTime="2026-03-20 11:02:12.327735343 +0000 UTC m=+456.549096241"
Mar 20 11:02:13 crc kubenswrapper[4860]: I0320 11:02:13.302671 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-47qz8" event={"ID":"a3a77828-39d7-4547-ba09-26a9a0fb8e7b","Type":"ContainerStarted","Data":"1b3ec51cf2c526996dc1328276334f591b28f8d8fe3810e296d7cb4924676dee"}
Mar 20 11:02:14 crc kubenswrapper[4860]: I0320 11:02:14.311997 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mhds4" event={"ID":"f820689f-28ee-4cbe-bf7b-049d9ec6ef64","Type":"ContainerStarted","Data":"4fa395af66801548de446165dab28cdcfc56e658660bf981d9dc1587a6995b23"}
Mar 20 11:02:14 crc kubenswrapper[4860]: I0320 11:02:14.333831 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mhds4" podStartSLOduration=2.174703475 podStartE2EDuration="4.333803016s" podCreationTimestamp="2026-03-20 11:02:10 +0000 UTC" firstStartedPulling="2026-03-20 11:02:11.249151709 +0000 UTC m=+455.470512607" lastFinishedPulling="2026-03-20 11:02:13.40825125 +0000 UTC m=+457.629612148" observedRunningTime="2026-03-20 11:02:14.333323643 +0000 UTC m=+458.554684571" watchObservedRunningTime="2026-03-20 11:02:14.333803016 +0000 UTC m=+458.555163914"
Mar 20 11:02:14 crc kubenswrapper[4860]: I0320 11:02:14.337241 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-47qz8" podStartSLOduration=2.686000779 podStartE2EDuration="5.33721725s" podCreationTimestamp="2026-03-20 11:02:09 +0000 UTC" firstStartedPulling="2026-03-20 11:02:10.206850965 +0000 UTC m=+454.428211863" lastFinishedPulling="2026-03-20 11:02:12.858067436 +0000 UTC m=+457.079428334" observedRunningTime="2026-03-20 11:02:13.332675087 +0000 UTC m=+457.554035985" watchObservedRunningTime="2026-03-20 11:02:14.33721725 +0000 UTC m=+458.558578148"
Mar 20 11:02:17 crc kubenswrapper[4860]: I0320 11:02:17.292428 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zrj5v"
Mar 20 11:02:17 crc kubenswrapper[4860]: I0320 11:02:17.292514 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zrj5v"
Mar 20 11:02:17 crc kubenswrapper[4860]: I0320 11:02:17.505154 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 11:02:17 crc kubenswrapper[4860]: I0320 11:02:17.505746 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 11:02:17 crc kubenswrapper[4860]: I0320 11:02:17.505782 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 11:02:17 crc kubenswrapper[4860]: I0320 11:02:17.505851 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 11:02:17 crc kubenswrapper[4860]: I0320 11:02:17.514333 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 11:02:17 crc kubenswrapper[4860]: I0320 11:02:17.514682 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 11:02:17 crc kubenswrapper[4860]: I0320 11:02:17.517239 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 11:02:17 crc kubenswrapper[4860]: I0320 11:02:17.536075 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 11:02:17 crc kubenswrapper[4860]: I0320 11:02:17.614110 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 11:02:17 crc kubenswrapper[4860]: I0320 11:02:17.619748 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 11:02:17 crc kubenswrapper[4860]: I0320 11:02:17.715165 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 11:02:18 crc kubenswrapper[4860]: I0320 11:02:18.303391 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-s2x6p"
Mar 20 11:02:18 crc kubenswrapper[4860]: I0320 11:02:18.303989 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-s2x6p"
Mar 20 11:02:18 crc kubenswrapper[4860]: I0320 11:02:18.333867 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zrj5v" podUID="8d34a762-55ad-41cb-994e-d4707bfebe22" containerName="registry-server" probeResult="failure" output=<
Mar 20 11:02:18 crc kubenswrapper[4860]: timeout: failed to connect service ":50051" within 1s
Mar 20 11:02:18 crc kubenswrapper[4860]: >
Mar 20 11:02:18 crc kubenswrapper[4860]: I0320 11:02:18.339515 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"f298e33c11e6b58e469b348abad4c52ecd23dda75a18e6fcff7999ea1729b44e"}
Mar 20 11:02:18 crc kubenswrapper[4860]: I0320 11:02:18.339565 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"262672598caa4e4a0085905ae9658928784b225b2f71ee03f7b7fe31491f435e"}
Mar 20 11:02:18 crc kubenswrapper[4860]: I0320 11:02:18.341723 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"002c0b7c930a53311b4db7b99e5c4e0aa5ce0bfcae9c9717d370ec94841d28c5"}
Mar 20 11:02:18 crc kubenswrapper[4860]: I0320 11:02:18.341759 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"7ec992ed2adce225654dbd6f832cdfc256c8d935244ceb0addad630e60b9ba73"}
Mar 20 11:02:18 crc kubenswrapper[4860]: I0320 11:02:18.341916 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 11:02:18 crc kubenswrapper[4860]: I0320 11:02:18.343702 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"b906bee3273b9e266d2ede1cbd8a9d8d030f9120dad40f07f2edf1d66eb0fd8e"}
Mar 20 11:02:18 crc kubenswrapper[4860]: I0320 11:02:18.343753 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"3decbd652db47b6a9f28fcac12f076a7a7f283d40c4777cd025604439b3c1fbd"}
Mar 20 11:02:18 crc kubenswrapper[4860]: I0320 11:02:18.349606 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-s2x6p"
Mar 20 11:02:18 crc kubenswrapper[4860]: I0320 11:02:18.405909 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-s2x6p"
Mar 20 11:02:19 crc kubenswrapper[4860]: I0320 11:02:19.724799 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-47qz8"
Mar 20 11:02:19 crc kubenswrapper[4860]: I0320 11:02:19.725278 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-47qz8"
Mar 20 11:02:19 crc kubenswrapper[4860]: I0320 11:02:19.773682 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-47qz8"
Mar 20 11:02:20 crc kubenswrapper[4860]: I0320 11:02:20.402067 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-47qz8"
Mar 20 11:02:20 crc kubenswrapper[4860]: I0320 11:02:20.693884 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mhds4"
Mar 20 11:02:20 crc kubenswrapper[4860]: I0320 11:02:20.693971 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mhds4"
Mar 20 11:02:20 crc kubenswrapper[4860]: I0320 11:02:20.747931 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mhds4"
Mar 20 11:02:21 crc kubenswrapper[4860]: I0320 11:02:21.434884 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mhds4"
Mar 20 11:02:22 crc kubenswrapper[4860]: I0320 11:02:22.344858 4860 patch_prober.go:28] interesting pod/machine-config-daemon-kvdqp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 11:02:22 crc kubenswrapper[4860]: I0320 11:02:22.344934 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 11:02:25 crc kubenswrapper[4860]: I0320 11:02:25.575968 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-zjfp8"
Mar 20 11:02:25 crc kubenswrapper[4860]: I0320 11:02:25.668712 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-8dbgm"]
Mar 20 11:02:27 crc kubenswrapper[4860]: I0320 11:02:27.350965 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zrj5v"
Mar 20 11:02:27 crc kubenswrapper[4860]: I0320 11:02:27.439629 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zrj5v"
Mar 20 11:02:50 crc kubenswrapper[4860]: I0320 11:02:50.732130 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" podUID="39b41087-226b-4f73-9fc4-64616b430f2d" containerName="registry" containerID="cri-o://f3f20d10721886ec4a9101220e9e657b2d89cba3d5a38f7e81c1ff64702925d8" gracePeriod=30
Mar 20 11:02:51 crc kubenswrapper[4860]: I0320 11:02:51.091913 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm"
Mar 20 11:02:51 crc kubenswrapper[4860]: I0320 11:02:51.216601 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/39b41087-226b-4f73-9fc4-64616b430f2d-registry-tls\") pod \"39b41087-226b-4f73-9fc4-64616b430f2d\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") "
Mar 20 11:02:51 crc kubenswrapper[4860]: I0320 11:02:51.216671 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n92tj\" (UniqueName: \"kubernetes.io/projected/39b41087-226b-4f73-9fc4-64616b430f2d-kube-api-access-n92tj\") pod \"39b41087-226b-4f73-9fc4-64616b430f2d\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") "
Mar 20 11:02:51 crc kubenswrapper[4860]: I0320 11:02:51.218093 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/39b41087-226b-4f73-9fc4-64616b430f2d-bound-sa-token\") pod \"39b41087-226b-4f73-9fc4-64616b430f2d\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") "
Mar 20 11:02:51 crc kubenswrapper[4860]: I0320 11:02:51.218155 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/39b41087-226b-4f73-9fc4-64616b430f2d-trusted-ca\") pod \"39b41087-226b-4f73-9fc4-64616b430f2d\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") "
Mar 20 11:02:51 crc kubenswrapper[4860]: I0320 11:02:51.218331 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/39b41087-226b-4f73-9fc4-64616b430f2d-ca-trust-extracted\") pod \"39b41087-226b-4f73-9fc4-64616b430f2d\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") "
Mar 20 11:02:51 crc kubenswrapper[4860]: I0320 11:02:51.218375 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/39b41087-226b-4f73-9fc4-64616b430f2d-installation-pull-secrets\") pod \"39b41087-226b-4f73-9fc4-64616b430f2d\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") "
Mar 20 11:02:51 crc kubenswrapper[4860]: I0320 11:02:51.218514 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"39b41087-226b-4f73-9fc4-64616b430f2d\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") "
Mar 20 11:02:51 crc kubenswrapper[4860]: I0320 11:02:51.218641 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/39b41087-226b-4f73-9fc4-64616b430f2d-registry-certificates\") pod \"39b41087-226b-4f73-9fc4-64616b430f2d\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") "
Mar 20 11:02:51 crc kubenswrapper[4860]: I0320 11:02:51.220192 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39b41087-226b-4f73-9fc4-64616b430f2d-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "39b41087-226b-4f73-9fc4-64616b430f2d" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 11:02:51 crc kubenswrapper[4860]: I0320 11:02:51.220205 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39b41087-226b-4f73-9fc4-64616b430f2d-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "39b41087-226b-4f73-9fc4-64616b430f2d" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 11:02:51 crc kubenswrapper[4860]: I0320 11:02:51.225383 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39b41087-226b-4f73-9fc4-64616b430f2d-kube-api-access-n92tj" (OuterVolumeSpecName: "kube-api-access-n92tj") pod "39b41087-226b-4f73-9fc4-64616b430f2d" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d"). InnerVolumeSpecName "kube-api-access-n92tj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 11:02:51 crc kubenswrapper[4860]: I0320 11:02:51.226332 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39b41087-226b-4f73-9fc4-64616b430f2d-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "39b41087-226b-4f73-9fc4-64616b430f2d" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 11:02:51 crc kubenswrapper[4860]: I0320 11:02:51.226706 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39b41087-226b-4f73-9fc4-64616b430f2d-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "39b41087-226b-4f73-9fc4-64616b430f2d" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 11:02:51 crc kubenswrapper[4860]: I0320 11:02:51.226813 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39b41087-226b-4f73-9fc4-64616b430f2d-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "39b41087-226b-4f73-9fc4-64616b430f2d" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 11:02:51 crc kubenswrapper[4860]: I0320 11:02:51.232856 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "39b41087-226b-4f73-9fc4-64616b430f2d" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 20 11:02:51 crc kubenswrapper[4860]: I0320 11:02:51.241627 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39b41087-226b-4f73-9fc4-64616b430f2d-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "39b41087-226b-4f73-9fc4-64616b430f2d" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 11:02:51 crc kubenswrapper[4860]: I0320 11:02:51.320612 4860 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/39b41087-226b-4f73-9fc4-64616b430f2d-bound-sa-token\") on node \"crc\" DevicePath \"\""
Mar 20 11:02:51 crc kubenswrapper[4860]: I0320 11:02:51.320682 4860 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/39b41087-226b-4f73-9fc4-64616b430f2d-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 20 11:02:51 crc kubenswrapper[4860]: I0320 11:02:51.320700 4860 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/39b41087-226b-4f73-9fc4-64616b430f2d-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Mar 20 11:02:51 crc kubenswrapper[4860]: I0320 11:02:51.320719 4860 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/39b41087-226b-4f73-9fc4-64616b430f2d-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Mar 20 11:02:51 crc kubenswrapper[4860]: I0320 11:02:51.320741 4860 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/39b41087-226b-4f73-9fc4-64616b430f2d-registry-certificates\") on node \"crc\" DevicePath \"\""
Mar 20 11:02:51 crc kubenswrapper[4860]: I0320 11:02:51.320757 4860 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/39b41087-226b-4f73-9fc4-64616b430f2d-registry-tls\") on node \"crc\" DevicePath \"\""
Mar 20 11:02:51 crc kubenswrapper[4860]: I0320 11:02:51.320773 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n92tj\" (UniqueName: \"kubernetes.io/projected/39b41087-226b-4f73-9fc4-64616b430f2d-kube-api-access-n92tj\") on node \"crc\" DevicePath \"\""
Mar 20 11:02:51 crc kubenswrapper[4860]: I0320 11:02:51.552981 4860 generic.go:334] "Generic (PLEG): container finished" podID="39b41087-226b-4f73-9fc4-64616b430f2d" containerID="f3f20d10721886ec4a9101220e9e657b2d89cba3d5a38f7e81c1ff64702925d8" exitCode=0
Mar 20 11:02:51 crc kubenswrapper[4860]: I0320 11:02:51.553049 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" event={"ID":"39b41087-226b-4f73-9fc4-64616b430f2d","Type":"ContainerDied","Data":"f3f20d10721886ec4a9101220e9e657b2d89cba3d5a38f7e81c1ff64702925d8"}
Mar 20 11:02:51 crc kubenswrapper[4860]: I0320 11:02:51.553087 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" event={"ID":"39b41087-226b-4f73-9fc4-64616b430f2d","Type":"ContainerDied","Data":"e1dc651025a62d4a8c6173da89e1be13654b3ccc8586f46471cbc5846256700b"}
Mar 20 11:02:51 crc kubenswrapper[4860]: I0320 11:02:51.553111 4860 scope.go:117] "RemoveContainer"
containerID="f3f20d10721886ec4a9101220e9e657b2d89cba3d5a38f7e81c1ff64702925d8" Mar 20 11:02:51 crc kubenswrapper[4860]: I0320 11:02:51.553126 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 11:02:51 crc kubenswrapper[4860]: I0320 11:02:51.580064 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-8dbgm"] Mar 20 11:02:51 crc kubenswrapper[4860]: I0320 11:02:51.585863 4860 scope.go:117] "RemoveContainer" containerID="f3f20d10721886ec4a9101220e9e657b2d89cba3d5a38f7e81c1ff64702925d8" Mar 20 11:02:51 crc kubenswrapper[4860]: E0320 11:02:51.586726 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3f20d10721886ec4a9101220e9e657b2d89cba3d5a38f7e81c1ff64702925d8\": container with ID starting with f3f20d10721886ec4a9101220e9e657b2d89cba3d5a38f7e81c1ff64702925d8 not found: ID does not exist" containerID="f3f20d10721886ec4a9101220e9e657b2d89cba3d5a38f7e81c1ff64702925d8" Mar 20 11:02:51 crc kubenswrapper[4860]: I0320 11:02:51.586766 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3f20d10721886ec4a9101220e9e657b2d89cba3d5a38f7e81c1ff64702925d8"} err="failed to get container status \"f3f20d10721886ec4a9101220e9e657b2d89cba3d5a38f7e81c1ff64702925d8\": rpc error: code = NotFound desc = could not find container \"f3f20d10721886ec4a9101220e9e657b2d89cba3d5a38f7e81c1ff64702925d8\": container with ID starting with f3f20d10721886ec4a9101220e9e657b2d89cba3d5a38f7e81c1ff64702925d8 not found: ID does not exist" Mar 20 11:02:51 crc kubenswrapper[4860]: I0320 11:02:51.587885 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-8dbgm"] Mar 20 11:02:52 crc kubenswrapper[4860]: I0320 11:02:52.344892 4860 patch_prober.go:28] interesting 
pod/machine-config-daemon-kvdqp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:02:52 crc kubenswrapper[4860]: I0320 11:02:52.345359 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:02:52 crc kubenswrapper[4860]: I0320 11:02:52.345435 4860 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" Mar 20 11:02:52 crc kubenswrapper[4860]: I0320 11:02:52.346196 4860 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"13fba41711384af20ab8200d2257307c76fbc844be8572bbe7995f53f6fb9ca6"} pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 11:02:52 crc kubenswrapper[4860]: I0320 11:02:52.346269 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" containerID="cri-o://13fba41711384af20ab8200d2257307c76fbc844be8572bbe7995f53f6fb9ca6" gracePeriod=600 Mar 20 11:02:52 crc kubenswrapper[4860]: I0320 11:02:52.563249 4860 generic.go:334] "Generic (PLEG): container finished" podID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerID="13fba41711384af20ab8200d2257307c76fbc844be8572bbe7995f53f6fb9ca6" exitCode=0 Mar 20 11:02:52 crc kubenswrapper[4860]: I0320 11:02:52.563275 
4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" event={"ID":"6a9df230-75a1-4b64-8d00-c179e9c19080","Type":"ContainerDied","Data":"13fba41711384af20ab8200d2257307c76fbc844be8572bbe7995f53f6fb9ca6"} Mar 20 11:02:52 crc kubenswrapper[4860]: I0320 11:02:52.563418 4860 scope.go:117] "RemoveContainer" containerID="2176d4712fa4ee31d4322bb4cf9433058b2da0d6000e5bfb9e7d230fdffb3eda" Mar 20 11:02:53 crc kubenswrapper[4860]: I0320 11:02:53.429267 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39b41087-226b-4f73-9fc4-64616b430f2d" path="/var/lib/kubelet/pods/39b41087-226b-4f73-9fc4-64616b430f2d/volumes" Mar 20 11:02:53 crc kubenswrapper[4860]: I0320 11:02:53.575293 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" event={"ID":"6a9df230-75a1-4b64-8d00-c179e9c19080","Type":"ContainerStarted","Data":"bf060ef062b20409d88a33daa69a9b6db0709a5d9a572c42e386c266f6d32bae"} Mar 20 11:02:57 crc kubenswrapper[4860]: I0320 11:02:57.625782 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 11:04:00 crc kubenswrapper[4860]: I0320 11:04:00.143753 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566744-9jlnw"] Mar 20 11:04:00 crc kubenswrapper[4860]: E0320 11:04:00.144957 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39b41087-226b-4f73-9fc4-64616b430f2d" containerName="registry" Mar 20 11:04:00 crc kubenswrapper[4860]: I0320 11:04:00.144973 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="39b41087-226b-4f73-9fc4-64616b430f2d" containerName="registry" Mar 20 11:04:00 crc kubenswrapper[4860]: I0320 11:04:00.146947 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="39b41087-226b-4f73-9fc4-64616b430f2d" containerName="registry" Mar 20 
11:04:00 crc kubenswrapper[4860]: I0320 11:04:00.147525 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566744-9jlnw" Mar 20 11:04:00 crc kubenswrapper[4860]: I0320 11:04:00.150741 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:04:00 crc kubenswrapper[4860]: I0320 11:04:00.150827 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-82p2f" Mar 20 11:04:00 crc kubenswrapper[4860]: I0320 11:04:00.154712 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:04:00 crc kubenswrapper[4860]: I0320 11:04:00.155503 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566744-9jlnw"] Mar 20 11:04:00 crc kubenswrapper[4860]: I0320 11:04:00.267650 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2hc4\" (UniqueName: \"kubernetes.io/projected/d72f65a1-efc6-45f5-a056-d64ee5bce755-kube-api-access-j2hc4\") pod \"auto-csr-approver-29566744-9jlnw\" (UID: \"d72f65a1-efc6-45f5-a056-d64ee5bce755\") " pod="openshift-infra/auto-csr-approver-29566744-9jlnw" Mar 20 11:04:00 crc kubenswrapper[4860]: I0320 11:04:00.368941 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2hc4\" (UniqueName: \"kubernetes.io/projected/d72f65a1-efc6-45f5-a056-d64ee5bce755-kube-api-access-j2hc4\") pod \"auto-csr-approver-29566744-9jlnw\" (UID: \"d72f65a1-efc6-45f5-a056-d64ee5bce755\") " pod="openshift-infra/auto-csr-approver-29566744-9jlnw" Mar 20 11:04:00 crc kubenswrapper[4860]: I0320 11:04:00.389539 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2hc4\" (UniqueName: 
\"kubernetes.io/projected/d72f65a1-efc6-45f5-a056-d64ee5bce755-kube-api-access-j2hc4\") pod \"auto-csr-approver-29566744-9jlnw\" (UID: \"d72f65a1-efc6-45f5-a056-d64ee5bce755\") " pod="openshift-infra/auto-csr-approver-29566744-9jlnw" Mar 20 11:04:00 crc kubenswrapper[4860]: I0320 11:04:00.476109 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566744-9jlnw" Mar 20 11:04:01 crc kubenswrapper[4860]: I0320 11:04:01.536629 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566744-9jlnw"] Mar 20 11:04:01 crc kubenswrapper[4860]: I0320 11:04:01.548327 4860 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 11:04:02 crc kubenswrapper[4860]: I0320 11:04:02.029381 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566744-9jlnw" event={"ID":"d72f65a1-efc6-45f5-a056-d64ee5bce755","Type":"ContainerStarted","Data":"d137de43b1404cfa990e48637b758554dd38a8dea4bbd8ae48f23f76b0ad72e8"} Mar 20 11:04:03 crc kubenswrapper[4860]: I0320 11:04:03.035832 4860 generic.go:334] "Generic (PLEG): container finished" podID="d72f65a1-efc6-45f5-a056-d64ee5bce755" containerID="d842a5cd77f0d9f0965cbf10a0f92313f544e8649c8e3427de05d3a92939e32e" exitCode=0 Mar 20 11:04:03 crc kubenswrapper[4860]: I0320 11:04:03.035944 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566744-9jlnw" event={"ID":"d72f65a1-efc6-45f5-a056-d64ee5bce755","Type":"ContainerDied","Data":"d842a5cd77f0d9f0965cbf10a0f92313f544e8649c8e3427de05d3a92939e32e"} Mar 20 11:04:04 crc kubenswrapper[4860]: I0320 11:04:04.270207 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566744-9jlnw" Mar 20 11:04:04 crc kubenswrapper[4860]: I0320 11:04:04.323273 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2hc4\" (UniqueName: \"kubernetes.io/projected/d72f65a1-efc6-45f5-a056-d64ee5bce755-kube-api-access-j2hc4\") pod \"d72f65a1-efc6-45f5-a056-d64ee5bce755\" (UID: \"d72f65a1-efc6-45f5-a056-d64ee5bce755\") " Mar 20 11:04:04 crc kubenswrapper[4860]: I0320 11:04:04.331851 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d72f65a1-efc6-45f5-a056-d64ee5bce755-kube-api-access-j2hc4" (OuterVolumeSpecName: "kube-api-access-j2hc4") pod "d72f65a1-efc6-45f5-a056-d64ee5bce755" (UID: "d72f65a1-efc6-45f5-a056-d64ee5bce755"). InnerVolumeSpecName "kube-api-access-j2hc4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:04:04 crc kubenswrapper[4860]: I0320 11:04:04.425388 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2hc4\" (UniqueName: \"kubernetes.io/projected/d72f65a1-efc6-45f5-a056-d64ee5bce755-kube-api-access-j2hc4\") on node \"crc\" DevicePath \"\"" Mar 20 11:04:05 crc kubenswrapper[4860]: I0320 11:04:05.050350 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566744-9jlnw" event={"ID":"d72f65a1-efc6-45f5-a056-d64ee5bce755","Type":"ContainerDied","Data":"d137de43b1404cfa990e48637b758554dd38a8dea4bbd8ae48f23f76b0ad72e8"} Mar 20 11:04:05 crc kubenswrapper[4860]: I0320 11:04:05.050397 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d137de43b1404cfa990e48637b758554dd38a8dea4bbd8ae48f23f76b0ad72e8" Mar 20 11:04:05 crc kubenswrapper[4860]: I0320 11:04:05.050405 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566744-9jlnw" Mar 20 11:04:05 crc kubenswrapper[4860]: I0320 11:04:05.339187 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566738-5cj22"] Mar 20 11:04:05 crc kubenswrapper[4860]: I0320 11:04:05.343399 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566738-5cj22"] Mar 20 11:04:05 crc kubenswrapper[4860]: I0320 11:04:05.421696 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba2ab33e-6ecc-4eac-9aaa-256e6ff68236" path="/var/lib/kubelet/pods/ba2ab33e-6ecc-4eac-9aaa-256e6ff68236/volumes" Mar 20 11:04:40 crc kubenswrapper[4860]: I0320 11:04:40.711994 4860 scope.go:117] "RemoveContainer" containerID="5ed610d57137030afeaeb124289fb2f5072934d814423d8d1fd76ae4e4bbd772" Mar 20 11:04:52 crc kubenswrapper[4860]: I0320 11:04:52.344997 4860 patch_prober.go:28] interesting pod/machine-config-daemon-kvdqp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:04:52 crc kubenswrapper[4860]: I0320 11:04:52.345785 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:05:22 crc kubenswrapper[4860]: I0320 11:05:22.344724 4860 patch_prober.go:28] interesting pod/machine-config-daemon-kvdqp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:05:22 crc kubenswrapper[4860]: 
I0320 11:05:22.345672 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:05:40 crc kubenswrapper[4860]: I0320 11:05:40.751700 4860 scope.go:117] "RemoveContainer" containerID="133313b654a587091e098b1e8505700f3bdc77cfa2efebc3f2529891730788bc" Mar 20 11:05:40 crc kubenswrapper[4860]: I0320 11:05:40.800552 4860 scope.go:117] "RemoveContainer" containerID="a570fccad49b61cba0e967c8a578dc38074b1cc636e9a24780ad0104b28b074c" Mar 20 11:05:40 crc kubenswrapper[4860]: I0320 11:05:40.818702 4860 scope.go:117] "RemoveContainer" containerID="e9d5be622305f211c3e12e12dfe924fbcb59aee9d65c271786e393a4e06cec5a" Mar 20 11:05:52 crc kubenswrapper[4860]: I0320 11:05:52.344993 4860 patch_prober.go:28] interesting pod/machine-config-daemon-kvdqp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:05:52 crc kubenswrapper[4860]: I0320 11:05:52.346197 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:05:52 crc kubenswrapper[4860]: I0320 11:05:52.346291 4860 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" Mar 20 11:05:52 crc kubenswrapper[4860]: I0320 11:05:52.347158 4860 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bf060ef062b20409d88a33daa69a9b6db0709a5d9a572c42e386c266f6d32bae"} pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 11:05:52 crc kubenswrapper[4860]: I0320 11:05:52.347239 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" containerID="cri-o://bf060ef062b20409d88a33daa69a9b6db0709a5d9a572c42e386c266f6d32bae" gracePeriod=600 Mar 20 11:05:52 crc kubenswrapper[4860]: I0320 11:05:52.776516 4860 generic.go:334] "Generic (PLEG): container finished" podID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerID="bf060ef062b20409d88a33daa69a9b6db0709a5d9a572c42e386c266f6d32bae" exitCode=0 Mar 20 11:05:52 crc kubenswrapper[4860]: I0320 11:05:52.776602 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" event={"ID":"6a9df230-75a1-4b64-8d00-c179e9c19080","Type":"ContainerDied","Data":"bf060ef062b20409d88a33daa69a9b6db0709a5d9a572c42e386c266f6d32bae"} Mar 20 11:05:52 crc kubenswrapper[4860]: I0320 11:05:52.777058 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" event={"ID":"6a9df230-75a1-4b64-8d00-c179e9c19080","Type":"ContainerStarted","Data":"4b71d74fc211e299a940948264ea488965d39574d77d6b6b358fff8d3e35e4e3"} Mar 20 11:05:52 crc kubenswrapper[4860]: I0320 11:05:52.777098 4860 scope.go:117] "RemoveContainer" containerID="13fba41711384af20ab8200d2257307c76fbc844be8572bbe7995f53f6fb9ca6" Mar 20 11:06:00 crc kubenswrapper[4860]: I0320 11:06:00.139746 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566746-xphvf"] Mar 20 11:06:00 crc 
kubenswrapper[4860]: E0320 11:06:00.141037 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d72f65a1-efc6-45f5-a056-d64ee5bce755" containerName="oc" Mar 20 11:06:00 crc kubenswrapper[4860]: I0320 11:06:00.141058 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="d72f65a1-efc6-45f5-a056-d64ee5bce755" containerName="oc" Mar 20 11:06:00 crc kubenswrapper[4860]: I0320 11:06:00.141185 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="d72f65a1-efc6-45f5-a056-d64ee5bce755" containerName="oc" Mar 20 11:06:00 crc kubenswrapper[4860]: I0320 11:06:00.141698 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566746-xphvf" Mar 20 11:06:00 crc kubenswrapper[4860]: I0320 11:06:00.144141 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-82p2f" Mar 20 11:06:00 crc kubenswrapper[4860]: I0320 11:06:00.144885 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:06:00 crc kubenswrapper[4860]: I0320 11:06:00.144943 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:06:00 crc kubenswrapper[4860]: I0320 11:06:00.166461 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566746-xphvf"] Mar 20 11:06:00 crc kubenswrapper[4860]: I0320 11:06:00.187071 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qrbn\" (UniqueName: \"kubernetes.io/projected/200c6cd9-8753-4805-9a49-50d3e429ea33-kube-api-access-9qrbn\") pod \"auto-csr-approver-29566746-xphvf\" (UID: \"200c6cd9-8753-4805-9a49-50d3e429ea33\") " pod="openshift-infra/auto-csr-approver-29566746-xphvf" Mar 20 11:06:00 crc kubenswrapper[4860]: I0320 11:06:00.288464 4860 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-9qrbn\" (UniqueName: \"kubernetes.io/projected/200c6cd9-8753-4805-9a49-50d3e429ea33-kube-api-access-9qrbn\") pod \"auto-csr-approver-29566746-xphvf\" (UID: \"200c6cd9-8753-4805-9a49-50d3e429ea33\") " pod="openshift-infra/auto-csr-approver-29566746-xphvf" Mar 20 11:06:00 crc kubenswrapper[4860]: I0320 11:06:00.312027 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qrbn\" (UniqueName: \"kubernetes.io/projected/200c6cd9-8753-4805-9a49-50d3e429ea33-kube-api-access-9qrbn\") pod \"auto-csr-approver-29566746-xphvf\" (UID: \"200c6cd9-8753-4805-9a49-50d3e429ea33\") " pod="openshift-infra/auto-csr-approver-29566746-xphvf" Mar 20 11:06:00 crc kubenswrapper[4860]: I0320 11:06:00.466640 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566746-xphvf" Mar 20 11:06:00 crc kubenswrapper[4860]: I0320 11:06:00.699916 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566746-xphvf"] Mar 20 11:06:00 crc kubenswrapper[4860]: I0320 11:06:00.830505 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566746-xphvf" event={"ID":"200c6cd9-8753-4805-9a49-50d3e429ea33","Type":"ContainerStarted","Data":"0a95a865472411ce5cf93613ee574462714ddc527db1d102e3ca1fd536269940"} Mar 20 11:06:02 crc kubenswrapper[4860]: I0320 11:06:02.846090 4860 generic.go:334] "Generic (PLEG): container finished" podID="200c6cd9-8753-4805-9a49-50d3e429ea33" containerID="6b40403be918a788bbcc242393eb71ec98682fddffb9062133713238970f5b03" exitCode=0 Mar 20 11:06:02 crc kubenswrapper[4860]: I0320 11:06:02.846147 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566746-xphvf" event={"ID":"200c6cd9-8753-4805-9a49-50d3e429ea33","Type":"ContainerDied","Data":"6b40403be918a788bbcc242393eb71ec98682fddffb9062133713238970f5b03"} Mar 20 11:06:04 crc 
kubenswrapper[4860]: I0320 11:06:04.075641 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566746-xphvf" Mar 20 11:06:04 crc kubenswrapper[4860]: I0320 11:06:04.139164 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qrbn\" (UniqueName: \"kubernetes.io/projected/200c6cd9-8753-4805-9a49-50d3e429ea33-kube-api-access-9qrbn\") pod \"200c6cd9-8753-4805-9a49-50d3e429ea33\" (UID: \"200c6cd9-8753-4805-9a49-50d3e429ea33\") " Mar 20 11:06:04 crc kubenswrapper[4860]: I0320 11:06:04.146523 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/200c6cd9-8753-4805-9a49-50d3e429ea33-kube-api-access-9qrbn" (OuterVolumeSpecName: "kube-api-access-9qrbn") pod "200c6cd9-8753-4805-9a49-50d3e429ea33" (UID: "200c6cd9-8753-4805-9a49-50d3e429ea33"). InnerVolumeSpecName "kube-api-access-9qrbn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:06:04 crc kubenswrapper[4860]: I0320 11:06:04.240451 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qrbn\" (UniqueName: \"kubernetes.io/projected/200c6cd9-8753-4805-9a49-50d3e429ea33-kube-api-access-9qrbn\") on node \"crc\" DevicePath \"\"" Mar 20 11:06:04 crc kubenswrapper[4860]: I0320 11:06:04.858279 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566746-xphvf" event={"ID":"200c6cd9-8753-4805-9a49-50d3e429ea33","Type":"ContainerDied","Data":"0a95a865472411ce5cf93613ee574462714ddc527db1d102e3ca1fd536269940"} Mar 20 11:06:04 crc kubenswrapper[4860]: I0320 11:06:04.858322 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566746-xphvf" Mar 20 11:06:04 crc kubenswrapper[4860]: I0320 11:06:04.858324 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a95a865472411ce5cf93613ee574462714ddc527db1d102e3ca1fd536269940" Mar 20 11:06:05 crc kubenswrapper[4860]: I0320 11:06:05.140493 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566740-26bw9"] Mar 20 11:06:05 crc kubenswrapper[4860]: I0320 11:06:05.144207 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566740-26bw9"] Mar 20 11:06:05 crc kubenswrapper[4860]: I0320 11:06:05.424737 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b31d1240-ea69-4da9-9a40-70f252222d4d" path="/var/lib/kubelet/pods/b31d1240-ea69-4da9-9a40-70f252222d4d/volumes" Mar 20 11:07:40 crc kubenswrapper[4860]: I0320 11:07:40.876948 4860 scope.go:117] "RemoveContainer" containerID="e24c70c83330a3f76a9f16d28ca14d8e62ae9184fa806a34c1edf7e65a681362" Mar 20 11:07:52 crc kubenswrapper[4860]: I0320 11:07:52.344565 4860 patch_prober.go:28] interesting pod/machine-config-daemon-kvdqp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:07:52 crc kubenswrapper[4860]: I0320 11:07:52.345481 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:08:00 crc kubenswrapper[4860]: I0320 11:08:00.140083 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566748-z8vsk"] Mar 20 
11:08:00 crc kubenswrapper[4860]: E0320 11:08:00.141046 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="200c6cd9-8753-4805-9a49-50d3e429ea33" containerName="oc" Mar 20 11:08:00 crc kubenswrapper[4860]: I0320 11:08:00.141064 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="200c6cd9-8753-4805-9a49-50d3e429ea33" containerName="oc" Mar 20 11:08:00 crc kubenswrapper[4860]: I0320 11:08:00.141219 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="200c6cd9-8753-4805-9a49-50d3e429ea33" containerName="oc" Mar 20 11:08:00 crc kubenswrapper[4860]: I0320 11:08:00.141739 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566748-z8vsk" Mar 20 11:08:00 crc kubenswrapper[4860]: I0320 11:08:00.143999 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:08:00 crc kubenswrapper[4860]: I0320 11:08:00.144142 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:08:00 crc kubenswrapper[4860]: I0320 11:08:00.144311 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-82p2f" Mar 20 11:08:00 crc kubenswrapper[4860]: I0320 11:08:00.182730 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566748-z8vsk"] Mar 20 11:08:00 crc kubenswrapper[4860]: I0320 11:08:00.334161 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7pxh\" (UniqueName: \"kubernetes.io/projected/d05f5e64-f0ec-45f9-a491-7dde7bdf6538-kube-api-access-b7pxh\") pod \"auto-csr-approver-29566748-z8vsk\" (UID: \"d05f5e64-f0ec-45f9-a491-7dde7bdf6538\") " pod="openshift-infra/auto-csr-approver-29566748-z8vsk" Mar 20 11:08:00 crc kubenswrapper[4860]: I0320 11:08:00.436606 4860 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-b7pxh\" (UniqueName: \"kubernetes.io/projected/d05f5e64-f0ec-45f9-a491-7dde7bdf6538-kube-api-access-b7pxh\") pod \"auto-csr-approver-29566748-z8vsk\" (UID: \"d05f5e64-f0ec-45f9-a491-7dde7bdf6538\") " pod="openshift-infra/auto-csr-approver-29566748-z8vsk" Mar 20 11:08:00 crc kubenswrapper[4860]: I0320 11:08:00.457884 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7pxh\" (UniqueName: \"kubernetes.io/projected/d05f5e64-f0ec-45f9-a491-7dde7bdf6538-kube-api-access-b7pxh\") pod \"auto-csr-approver-29566748-z8vsk\" (UID: \"d05f5e64-f0ec-45f9-a491-7dde7bdf6538\") " pod="openshift-infra/auto-csr-approver-29566748-z8vsk" Mar 20 11:08:00 crc kubenswrapper[4860]: I0320 11:08:00.463484 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566748-z8vsk" Mar 20 11:08:00 crc kubenswrapper[4860]: I0320 11:08:00.876981 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566748-z8vsk"] Mar 20 11:08:01 crc kubenswrapper[4860]: I0320 11:08:01.583583 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566748-z8vsk" event={"ID":"d05f5e64-f0ec-45f9-a491-7dde7bdf6538","Type":"ContainerStarted","Data":"9bc9f0b9b0200e60208d540e2757fbf686cdf2e7c6ab7cbd74f90dcf986d30e6"} Mar 20 11:08:02 crc kubenswrapper[4860]: I0320 11:08:02.590262 4860 generic.go:334] "Generic (PLEG): container finished" podID="d05f5e64-f0ec-45f9-a491-7dde7bdf6538" containerID="478ee16ae7828909784a1f93be49bfc8c3fee1419599f3474cd82711371e05b3" exitCode=0 Mar 20 11:08:02 crc kubenswrapper[4860]: I0320 11:08:02.590362 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566748-z8vsk" 
event={"ID":"d05f5e64-f0ec-45f9-a491-7dde7bdf6538","Type":"ContainerDied","Data":"478ee16ae7828909784a1f93be49bfc8c3fee1419599f3474cd82711371e05b3"} Mar 20 11:08:03 crc kubenswrapper[4860]: I0320 11:08:03.839588 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566748-z8vsk" Mar 20 11:08:03 crc kubenswrapper[4860]: I0320 11:08:03.982803 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7pxh\" (UniqueName: \"kubernetes.io/projected/d05f5e64-f0ec-45f9-a491-7dde7bdf6538-kube-api-access-b7pxh\") pod \"d05f5e64-f0ec-45f9-a491-7dde7bdf6538\" (UID: \"d05f5e64-f0ec-45f9-a491-7dde7bdf6538\") " Mar 20 11:08:03 crc kubenswrapper[4860]: I0320 11:08:03.991274 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d05f5e64-f0ec-45f9-a491-7dde7bdf6538-kube-api-access-b7pxh" (OuterVolumeSpecName: "kube-api-access-b7pxh") pod "d05f5e64-f0ec-45f9-a491-7dde7bdf6538" (UID: "d05f5e64-f0ec-45f9-a491-7dde7bdf6538"). InnerVolumeSpecName "kube-api-access-b7pxh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:08:04 crc kubenswrapper[4860]: I0320 11:08:04.084583 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b7pxh\" (UniqueName: \"kubernetes.io/projected/d05f5e64-f0ec-45f9-a491-7dde7bdf6538-kube-api-access-b7pxh\") on node \"crc\" DevicePath \"\"" Mar 20 11:08:04 crc kubenswrapper[4860]: I0320 11:08:04.606685 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566748-z8vsk" event={"ID":"d05f5e64-f0ec-45f9-a491-7dde7bdf6538","Type":"ContainerDied","Data":"9bc9f0b9b0200e60208d540e2757fbf686cdf2e7c6ab7cbd74f90dcf986d30e6"} Mar 20 11:08:04 crc kubenswrapper[4860]: I0320 11:08:04.607057 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9bc9f0b9b0200e60208d540e2757fbf686cdf2e7c6ab7cbd74f90dcf986d30e6" Mar 20 11:08:04 crc kubenswrapper[4860]: I0320 11:08:04.607126 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566748-z8vsk" Mar 20 11:08:04 crc kubenswrapper[4860]: I0320 11:08:04.909911 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566742-tczkf"] Mar 20 11:08:04 crc kubenswrapper[4860]: I0320 11:08:04.916877 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566742-tczkf"] Mar 20 11:08:05 crc kubenswrapper[4860]: I0320 11:08:05.422634 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="980f5756-4935-469d-933b-f4e339ded9a4" path="/var/lib/kubelet/pods/980f5756-4935-469d-933b-f4e339ded9a4/volumes" Mar 20 11:08:22 crc kubenswrapper[4860]: I0320 11:08:22.344140 4860 patch_prober.go:28] interesting pod/machine-config-daemon-kvdqp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 20 11:08:22 crc kubenswrapper[4860]: I0320 11:08:22.345329 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:08:36 crc kubenswrapper[4860]: I0320 11:08:36.725264 4860 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 20 11:08:40 crc kubenswrapper[4860]: I0320 11:08:40.931999 4860 scope.go:117] "RemoveContainer" containerID="5e0b7b6725e58dc6c9517f6806ff8c8ba7c117d2ad076272e2c94e40ea777f46" Mar 20 11:08:52 crc kubenswrapper[4860]: I0320 11:08:52.344961 4860 patch_prober.go:28] interesting pod/machine-config-daemon-kvdqp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:08:52 crc kubenswrapper[4860]: I0320 11:08:52.346058 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:08:52 crc kubenswrapper[4860]: I0320 11:08:52.346124 4860 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" Mar 20 11:08:52 crc kubenswrapper[4860]: I0320 11:08:52.346933 4860 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"4b71d74fc211e299a940948264ea488965d39574d77d6b6b358fff8d3e35e4e3"} pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 11:08:52 crc kubenswrapper[4860]: I0320 11:08:52.347005 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" containerID="cri-o://4b71d74fc211e299a940948264ea488965d39574d77d6b6b358fff8d3e35e4e3" gracePeriod=600 Mar 20 11:08:52 crc kubenswrapper[4860]: I0320 11:08:52.903193 4860 generic.go:334] "Generic (PLEG): container finished" podID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerID="4b71d74fc211e299a940948264ea488965d39574d77d6b6b358fff8d3e35e4e3" exitCode=0 Mar 20 11:08:52 crc kubenswrapper[4860]: I0320 11:08:52.904823 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" event={"ID":"6a9df230-75a1-4b64-8d00-c179e9c19080","Type":"ContainerDied","Data":"4b71d74fc211e299a940948264ea488965d39574d77d6b6b358fff8d3e35e4e3"} Mar 20 11:08:52 crc kubenswrapper[4860]: I0320 11:08:52.904870 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" event={"ID":"6a9df230-75a1-4b64-8d00-c179e9c19080","Type":"ContainerStarted","Data":"8a778954d1374877671fc9bca86b7581cc1911487a943aba7bf61952dec5e818"} Mar 20 11:08:52 crc kubenswrapper[4860]: I0320 11:08:52.904894 4860 scope.go:117] "RemoveContainer" containerID="bf060ef062b20409d88a33daa69a9b6db0709a5d9a572c42e386c266f6d32bae" Mar 20 11:09:50 crc kubenswrapper[4860]: I0320 11:09:50.491170 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-nbkmw"] Mar 20 11:09:50 crc kubenswrapper[4860]: I0320 11:09:50.492466 
4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="ovn-controller" containerID="cri-o://0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d" gracePeriod=30 Mar 20 11:09:50 crc kubenswrapper[4860]: I0320 11:09:50.492554 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="nbdb" containerID="cri-o://903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf" gracePeriod=30 Mar 20 11:09:50 crc kubenswrapper[4860]: I0320 11:09:50.492587 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18" gracePeriod=30 Mar 20 11:09:50 crc kubenswrapper[4860]: I0320 11:09:50.492635 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="ovn-acl-logging" containerID="cri-o://a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e" gracePeriod=30 Mar 20 11:09:50 crc kubenswrapper[4860]: I0320 11:09:50.492587 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="northd" containerID="cri-o://f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f" gracePeriod=30 Mar 20 11:09:50 crc kubenswrapper[4860]: I0320 11:09:50.492617 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" 
podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="kube-rbac-proxy-node" containerID="cri-o://9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793" gracePeriod=30 Mar 20 11:09:50 crc kubenswrapper[4860]: I0320 11:09:50.492893 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="sbdb" containerID="cri-o://9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9" gracePeriod=30 Mar 20 11:09:50 crc kubenswrapper[4860]: I0320 11:09:50.632118 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="ovnkube-controller" containerID="cri-o://4f5a30ad09d458bec4ede239893e4825bfe7b1e45f543e00827124da5aa465f6" gracePeriod=30 Mar 20 11:09:50 crc kubenswrapper[4860]: I0320 11:09:50.944514 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nbkmw_eb85f6f9-1c0f-4388-9464-25dfe48d8d0f/ovnkube-controller/3.log" Mar 20 11:09:50 crc kubenswrapper[4860]: I0320 11:09:50.947561 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nbkmw_eb85f6f9-1c0f-4388-9464-25dfe48d8d0f/ovn-acl-logging/0.log" Mar 20 11:09:50 crc kubenswrapper[4860]: I0320 11:09:50.948070 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nbkmw_eb85f6f9-1c0f-4388-9464-25dfe48d8d0f/ovn-controller/0.log" Mar 20 11:09:50 crc kubenswrapper[4860]: I0320 11:09:50.948658 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.018150 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-dbgf2"] Mar 20 11:09:51 crc kubenswrapper[4860]: E0320 11:09:51.018613 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="kube-rbac-proxy-ovn-metrics" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.018638 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="kube-rbac-proxy-ovn-metrics" Mar 20 11:09:51 crc kubenswrapper[4860]: E0320 11:09:51.018657 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="kube-rbac-proxy-node" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.018671 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="kube-rbac-proxy-node" Mar 20 11:09:51 crc kubenswrapper[4860]: E0320 11:09:51.018696 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="northd" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.018710 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="northd" Mar 20 11:09:51 crc kubenswrapper[4860]: E0320 11:09:51.018727 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="ovnkube-controller" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.018739 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="ovnkube-controller" Mar 20 11:09:51 crc kubenswrapper[4860]: E0320 11:09:51.018755 4860 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="ovn-controller" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.018768 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="ovn-controller" Mar 20 11:09:51 crc kubenswrapper[4860]: E0320 11:09:51.018785 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="kubecfg-setup" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.018797 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="kubecfg-setup" Mar 20 11:09:51 crc kubenswrapper[4860]: E0320 11:09:51.018813 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="ovnkube-controller" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.018856 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="ovnkube-controller" Mar 20 11:09:51 crc kubenswrapper[4860]: E0320 11:09:51.018870 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="ovn-acl-logging" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.018882 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="ovn-acl-logging" Mar 20 11:09:51 crc kubenswrapper[4860]: E0320 11:09:51.018900 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="sbdb" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.018912 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="sbdb" Mar 20 11:09:51 crc kubenswrapper[4860]: E0320 11:09:51.018930 4860 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d05f5e64-f0ec-45f9-a491-7dde7bdf6538" containerName="oc" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.018944 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="d05f5e64-f0ec-45f9-a491-7dde7bdf6538" containerName="oc" Mar 20 11:09:51 crc kubenswrapper[4860]: E0320 11:09:51.018963 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="ovnkube-controller" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.018975 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="ovnkube-controller" Mar 20 11:09:51 crc kubenswrapper[4860]: E0320 11:09:51.018989 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="nbdb" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.019001 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="nbdb" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.019170 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="sbdb" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.019194 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="d05f5e64-f0ec-45f9-a491-7dde7bdf6538" containerName="oc" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.019217 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="ovnkube-controller" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.019370 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="nbdb" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.019386 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" 
containerName="ovnkube-controller" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.019400 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="kube-rbac-proxy-ovn-metrics" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.019417 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="ovn-acl-logging" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.019435 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="northd" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.019455 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="ovnkube-controller" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.019467 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="kube-rbac-proxy-node" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.019483 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="ovn-controller" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.019495 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="ovnkube-controller" Mar 20 11:09:51 crc kubenswrapper[4860]: E0320 11:09:51.019657 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="ovnkube-controller" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.019671 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="ovnkube-controller" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.019838 4860 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="ovnkube-controller" Mar 20 11:09:51 crc kubenswrapper[4860]: E0320 11:09:51.020003 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="ovnkube-controller" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.020028 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="ovnkube-controller" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.022948 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.045247 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9btp\" (UniqueName: \"kubernetes.io/projected/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-kube-api-access-l9btp\") pod \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.045364 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-etc-openvswitch\") pod \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.045411 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-run-ovn-kubernetes\") pod \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.045440 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-run-systemd\") pod \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.045507 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-ovnkube-script-lib\") pod \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.045554 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-env-overrides\") pod \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.045597 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-cni-bin\") pod \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.045642 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-ovn-node-metrics-cert\") pod \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.045674 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-log-socket\") pod \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 
11:09:51.045724 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-node-log\") pod \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.045764 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-systemd-units\") pod \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.045792 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-run-ovn\") pod \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.045834 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-slash\") pod \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.045865 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.045896 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-run-netns\") 
pod \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.045929 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-ovnkube-config\") pod \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.045973 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-run-openvswitch\") pod \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.046027 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-var-lib-openvswitch\") pod \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.046061 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-kubelet\") pod \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.046093 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-cni-netd\") pod \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.046365 4860 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-host-slash\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.046408 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-log-socket\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.046442 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-node-log\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.046488 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-ovnkube-script-lib\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.046528 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6gvr\" (UniqueName: \"kubernetes.io/projected/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-kube-api-access-m6gvr\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.046561 4860 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-host-cni-netd\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.046594 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-host-run-netns\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.046627 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-var-lib-openvswitch\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.046660 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-ovn-node-metrics-cert\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.046697 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-host-kubelet\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc 
kubenswrapper[4860]: I0320 11:09:51.046738 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-ovnkube-config\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.046806 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.046840 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-etc-openvswitch\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.046872 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-run-openvswitch\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.046903 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-host-run-ovn-kubernetes\") pod \"ovnkube-node-dbgf2\" (UID: 
\"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.046945 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-systemd-units\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.046979 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-run-systemd\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.047015 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-env-overrides\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.047044 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-run-ovn\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.047075 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-host-cni-bin\") pod \"ovnkube-node-dbgf2\" (UID: 
\"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.047173 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" (UID: "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.047207 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" (UID: "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.047268 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" (UID: "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.047269 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" (UID: "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.047296 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" (UID: "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.047317 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" (UID: "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.047354 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" (UID: "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.047540 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-node-log" (OuterVolumeSpecName: "node-log") pod "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" (UID: "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.047571 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" (UID: "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.048103 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-log-socket" (OuterVolumeSpecName: "log-socket") pod "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" (UID: "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.048147 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-slash" (OuterVolumeSpecName: "host-slash") pod "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" (UID: "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.048174 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" (UID: "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.048199 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" (UID: "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.048201 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" (UID: "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.048310 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" (UID: "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.048420 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" (UID: "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.048454 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" (UID: "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.055248 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-kube-api-access-l9btp" (OuterVolumeSpecName: "kube-api-access-l9btp") pod "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" (UID: "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f"). InnerVolumeSpecName "kube-api-access-l9btp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.057945 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" (UID: "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.065916 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" (UID: "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.148032 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-host-kubelet\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.148112 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-ovnkube-config\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.148188 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.148200 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-host-kubelet\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.148254 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-etc-openvswitch\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.148291 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-run-openvswitch\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.148320 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-host-run-ovn-kubernetes\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.148359 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-systemd-units\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.148394 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-run-systemd\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.148429 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-env-overrides\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc 
kubenswrapper[4860]: I0320 11:09:51.148458 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-run-ovn\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.148490 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-host-cni-bin\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.148522 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-host-slash\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.148548 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-log-socket\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.148577 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-node-log\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.148616 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-ovnkube-script-lib\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.148653 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6gvr\" (UniqueName: \"kubernetes.io/projected/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-kube-api-access-m6gvr\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.148683 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-host-cni-netd\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.148716 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-host-run-netns\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.148747 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-var-lib-openvswitch\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.148777 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-ovn-node-metrics-cert\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.148843 4860 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.148868 4860 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.148886 4860 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.148904 4860 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.148922 4860 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-log-socket\") on node \"crc\" DevicePath \"\"" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.148938 4860 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-node-log\") on node \"crc\" DevicePath \"\"" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.148956 4860 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.148975 4860 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.148991 4860 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-slash\") on node \"crc\" DevicePath \"\"" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.149009 4860 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.149025 4860 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.149045 4860 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.149063 4860 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.149081 4860 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-var-lib-openvswitch\") 
on node \"crc\" DevicePath \"\"" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.149533 4860 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.150297 4860 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.150311 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-host-cni-bin\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.149212 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-host-cni-netd\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.149281 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-etc-openvswitch\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.149341 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-host-run-netns\") pod \"ovnkube-node-dbgf2\" (UID: 
\"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.150404 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9btp\" (UniqueName: \"kubernetes.io/projected/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-kube-api-access-l9btp\") on node \"crc\" DevicePath \"\"" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.149374 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-var-lib-openvswitch\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.150430 4860 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.150478 4860 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.149637 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-ovnkube-config\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.150497 4860 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 20 11:09:51 crc kubenswrapper[4860]: 
I0320 11:09:51.149711 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-node-log\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.149674 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-log-socket\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.149723 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-run-ovn\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.149173 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-host-run-ovn-kubernetes\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.149313 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-run-openvswitch\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.149429 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-systemd-units\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.150656 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-ovnkube-script-lib\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.149217 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.149680 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-host-slash\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.150212 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-env-overrides\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.149402 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-run-systemd\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.156433 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-ovn-node-metrics-cert\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.166874 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6gvr\" (UniqueName: \"kubernetes.io/projected/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-kube-api-access-m6gvr\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.267865 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nbkmw_eb85f6f9-1c0f-4388-9464-25dfe48d8d0f/ovnkube-controller/3.log" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.273053 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nbkmw_eb85f6f9-1c0f-4388-9464-25dfe48d8d0f/ovn-acl-logging/0.log" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.273650 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nbkmw_eb85f6f9-1c0f-4388-9464-25dfe48d8d0f/ovn-controller/0.log" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275033 4860 generic.go:334] "Generic (PLEG): container finished" podID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerID="4f5a30ad09d458bec4ede239893e4825bfe7b1e45f543e00827124da5aa465f6" exitCode=0 Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275070 
4860 generic.go:334] "Generic (PLEG): container finished" podID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerID="9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9" exitCode=0 Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275078 4860 generic.go:334] "Generic (PLEG): container finished" podID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerID="903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf" exitCode=0 Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275085 4860 generic.go:334] "Generic (PLEG): container finished" podID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerID="f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f" exitCode=0 Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275093 4860 generic.go:334] "Generic (PLEG): container finished" podID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerID="9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18" exitCode=0 Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275082 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" event={"ID":"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f","Type":"ContainerDied","Data":"4f5a30ad09d458bec4ede239893e4825bfe7b1e45f543e00827124da5aa465f6"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275142 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" event={"ID":"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f","Type":"ContainerDied","Data":"9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275157 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" event={"ID":"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f","Type":"ContainerDied","Data":"903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275168 4860 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" event={"ID":"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f","Type":"ContainerDied","Data":"f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275100 4860 generic.go:334] "Generic (PLEG): container finished" podID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerID="9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793" exitCode=0 Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275171 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275209 4860 scope.go:117] "RemoveContainer" containerID="4f5a30ad09d458bec4ede239893e4825bfe7b1e45f543e00827124da5aa465f6" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275205 4860 generic.go:334] "Generic (PLEG): container finished" podID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerID="a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e" exitCode=143 Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275251 4860 generic.go:334] "Generic (PLEG): container finished" podID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerID="0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d" exitCode=143 Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275186 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" event={"ID":"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f","Type":"ContainerDied","Data":"9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275413 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" 
event={"ID":"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f","Type":"ContainerDied","Data":"9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275456 4860 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8b2175e8b0f3d0a1b8ef8642ce83c4b05c7ce1f29d0f3417822f71330a375049"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275476 4860 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275483 4860 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275489 4860 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275495 4860 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275501 4860 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275507 4860 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275520 4860 pod_container_deletor.go:114] "Failed 
to issue the request to remove container" containerID={"Type":"cri-o","ID":"0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275526 4860 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275535 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" event={"ID":"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f","Type":"ContainerDied","Data":"a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275545 4860 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4f5a30ad09d458bec4ede239893e4825bfe7b1e45f543e00827124da5aa465f6"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275552 4860 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8b2175e8b0f3d0a1b8ef8642ce83c4b05c7ce1f29d0f3417822f71330a375049"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275558 4860 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275567 4860 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275572 4860 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f"} Mar 20 11:09:51 crc kubenswrapper[4860]: 
I0320 11:09:51.275578 4860 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275586 4860 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275594 4860 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275600 4860 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275606 4860 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275613 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" event={"ID":"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f","Type":"ContainerDied","Data":"0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275622 4860 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4f5a30ad09d458bec4ede239893e4825bfe7b1e45f543e00827124da5aa465f6"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275629 4860 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"8b2175e8b0f3d0a1b8ef8642ce83c4b05c7ce1f29d0f3417822f71330a375049"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275634 4860 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275640 4860 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275645 4860 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275650 4860 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275656 4860 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275661 4860 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275666 4860 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275673 4860 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275681 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" event={"ID":"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f","Type":"ContainerDied","Data":"f02c2b7cd3b6fd372807cb3bef1e94c63376211be0ef5968a6850857b9722fe7"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275689 4860 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4f5a30ad09d458bec4ede239893e4825bfe7b1e45f543e00827124da5aa465f6"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275696 4860 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8b2175e8b0f3d0a1b8ef8642ce83c4b05c7ce1f29d0f3417822f71330a375049"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275702 4860 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275708 4860 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275714 4860 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275720 4860 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275725 4860 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275730 4860 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275736 4860 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275741 4860 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.278693 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cmc44_a89c8af2-338f-401f-aad5-c6d7763a3b3a/kube-multus/2.log" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.279870 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cmc44_a89c8af2-338f-401f-aad5-c6d7763a3b3a/kube-multus/1.log" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.279973 4860 generic.go:334] "Generic (PLEG): container finished" podID="a89c8af2-338f-401f-aad5-c6d7763a3b3a" containerID="bbf0bd8dd1e8efce7a65cd6499f4e5d67e95f7c0af27658c16d6dad07affb764" exitCode=2 Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.280068 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cmc44" event={"ID":"a89c8af2-338f-401f-aad5-c6d7763a3b3a","Type":"ContainerDied","Data":"bbf0bd8dd1e8efce7a65cd6499f4e5d67e95f7c0af27658c16d6dad07affb764"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.280121 4860 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e1d897530152cf8d1ddca69100f0ae29a4da57552de29a47aaed46aa70fa805e"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.280706 4860 scope.go:117] "RemoveContainer" containerID="bbf0bd8dd1e8efce7a65cd6499f4e5d67e95f7c0af27658c16d6dad07affb764" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.306511 4860 scope.go:117] "RemoveContainer" containerID="8b2175e8b0f3d0a1b8ef8642ce83c4b05c7ce1f29d0f3417822f71330a375049" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.333354 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-nbkmw"] Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.334211 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-nbkmw"] Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.337965 4860 scope.go:117] "RemoveContainer" containerID="9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.346099 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.367141 4860 scope.go:117] "RemoveContainer" containerID="903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.389986 4860 scope.go:117] "RemoveContainer" containerID="f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.417854 4860 scope.go:117] "RemoveContainer" containerID="9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.422110 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" path="/var/lib/kubelet/pods/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f/volumes" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.438240 4860 scope.go:117] "RemoveContainer" containerID="9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.482036 4860 scope.go:117] "RemoveContainer" containerID="a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.510106 4860 scope.go:117] "RemoveContainer" containerID="0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.545456 4860 scope.go:117] "RemoveContainer" containerID="95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.606265 4860 scope.go:117] "RemoveContainer" containerID="4f5a30ad09d458bec4ede239893e4825bfe7b1e45f543e00827124da5aa465f6" Mar 20 11:09:51 crc kubenswrapper[4860]: E0320 11:09:51.606946 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"4f5a30ad09d458bec4ede239893e4825bfe7b1e45f543e00827124da5aa465f6\": container with ID starting with 4f5a30ad09d458bec4ede239893e4825bfe7b1e45f543e00827124da5aa465f6 not found: ID does not exist" containerID="4f5a30ad09d458bec4ede239893e4825bfe7b1e45f543e00827124da5aa465f6" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.607005 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f5a30ad09d458bec4ede239893e4825bfe7b1e45f543e00827124da5aa465f6"} err="failed to get container status \"4f5a30ad09d458bec4ede239893e4825bfe7b1e45f543e00827124da5aa465f6\": rpc error: code = NotFound desc = could not find container \"4f5a30ad09d458bec4ede239893e4825bfe7b1e45f543e00827124da5aa465f6\": container with ID starting with 4f5a30ad09d458bec4ede239893e4825bfe7b1e45f543e00827124da5aa465f6 not found: ID does not exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.607043 4860 scope.go:117] "RemoveContainer" containerID="8b2175e8b0f3d0a1b8ef8642ce83c4b05c7ce1f29d0f3417822f71330a375049" Mar 20 11:09:51 crc kubenswrapper[4860]: E0320 11:09:51.607419 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b2175e8b0f3d0a1b8ef8642ce83c4b05c7ce1f29d0f3417822f71330a375049\": container with ID starting with 8b2175e8b0f3d0a1b8ef8642ce83c4b05c7ce1f29d0f3417822f71330a375049 not found: ID does not exist" containerID="8b2175e8b0f3d0a1b8ef8642ce83c4b05c7ce1f29d0f3417822f71330a375049" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.607457 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b2175e8b0f3d0a1b8ef8642ce83c4b05c7ce1f29d0f3417822f71330a375049"} err="failed to get container status \"8b2175e8b0f3d0a1b8ef8642ce83c4b05c7ce1f29d0f3417822f71330a375049\": rpc error: code = NotFound desc = could not find container \"8b2175e8b0f3d0a1b8ef8642ce83c4b05c7ce1f29d0f3417822f71330a375049\": container with ID 
starting with 8b2175e8b0f3d0a1b8ef8642ce83c4b05c7ce1f29d0f3417822f71330a375049 not found: ID does not exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.607486 4860 scope.go:117] "RemoveContainer" containerID="9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9" Mar 20 11:09:51 crc kubenswrapper[4860]: E0320 11:09:51.607748 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9\": container with ID starting with 9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9 not found: ID does not exist" containerID="9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.607782 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9"} err="failed to get container status \"9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9\": rpc error: code = NotFound desc = could not find container \"9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9\": container with ID starting with 9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9 not found: ID does not exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.607811 4860 scope.go:117] "RemoveContainer" containerID="903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf" Mar 20 11:09:51 crc kubenswrapper[4860]: E0320 11:09:51.608035 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf\": container with ID starting with 903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf not found: ID does not exist" containerID="903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf" Mar 20 
11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.608064 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf"} err="failed to get container status \"903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf\": rpc error: code = NotFound desc = could not find container \"903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf\": container with ID starting with 903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf not found: ID does not exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.608079 4860 scope.go:117] "RemoveContainer" containerID="f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f" Mar 20 11:09:51 crc kubenswrapper[4860]: E0320 11:09:51.608362 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f\": container with ID starting with f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f not found: ID does not exist" containerID="f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.608404 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f"} err="failed to get container status \"f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f\": rpc error: code = NotFound desc = could not find container \"f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f\": container with ID starting with f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f not found: ID does not exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.608430 4860 scope.go:117] "RemoveContainer" 
containerID="9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18" Mar 20 11:09:51 crc kubenswrapper[4860]: E0320 11:09:51.608664 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18\": container with ID starting with 9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18 not found: ID does not exist" containerID="9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.608695 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18"} err="failed to get container status \"9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18\": rpc error: code = NotFound desc = could not find container \"9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18\": container with ID starting with 9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18 not found: ID does not exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.608712 4860 scope.go:117] "RemoveContainer" containerID="9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793" Mar 20 11:09:51 crc kubenswrapper[4860]: E0320 11:09:51.609127 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793\": container with ID starting with 9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793 not found: ID does not exist" containerID="9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.609150 4860 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793"} err="failed to get container status \"9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793\": rpc error: code = NotFound desc = could not find container \"9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793\": container with ID starting with 9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793 not found: ID does not exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.609163 4860 scope.go:117] "RemoveContainer" containerID="a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e" Mar 20 11:09:51 crc kubenswrapper[4860]: E0320 11:09:51.609413 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e\": container with ID starting with a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e not found: ID does not exist" containerID="a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.609434 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e"} err="failed to get container status \"a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e\": rpc error: code = NotFound desc = could not find container \"a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e\": container with ID starting with a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e not found: ID does not exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.609454 4860 scope.go:117] "RemoveContainer" containerID="0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d" Mar 20 11:09:51 crc kubenswrapper[4860]: E0320 11:09:51.609943 4860 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d\": container with ID starting with 0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d not found: ID does not exist" containerID="0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.609973 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d"} err="failed to get container status \"0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d\": rpc error: code = NotFound desc = could not find container \"0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d\": container with ID starting with 0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d not found: ID does not exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.609992 4860 scope.go:117] "RemoveContainer" containerID="95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11" Mar 20 11:09:51 crc kubenswrapper[4860]: E0320 11:09:51.610210 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\": container with ID starting with 95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11 not found: ID does not exist" containerID="95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.610246 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11"} err="failed to get container status \"95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\": rpc error: code = NotFound desc = could not find container 
\"95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\": container with ID starting with 95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11 not found: ID does not exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.610262 4860 scope.go:117] "RemoveContainer" containerID="4f5a30ad09d458bec4ede239893e4825bfe7b1e45f543e00827124da5aa465f6" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.610442 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f5a30ad09d458bec4ede239893e4825bfe7b1e45f543e00827124da5aa465f6"} err="failed to get container status \"4f5a30ad09d458bec4ede239893e4825bfe7b1e45f543e00827124da5aa465f6\": rpc error: code = NotFound desc = could not find container \"4f5a30ad09d458bec4ede239893e4825bfe7b1e45f543e00827124da5aa465f6\": container with ID starting with 4f5a30ad09d458bec4ede239893e4825bfe7b1e45f543e00827124da5aa465f6 not found: ID does not exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.610462 4860 scope.go:117] "RemoveContainer" containerID="8b2175e8b0f3d0a1b8ef8642ce83c4b05c7ce1f29d0f3417822f71330a375049" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.610701 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b2175e8b0f3d0a1b8ef8642ce83c4b05c7ce1f29d0f3417822f71330a375049"} err="failed to get container status \"8b2175e8b0f3d0a1b8ef8642ce83c4b05c7ce1f29d0f3417822f71330a375049\": rpc error: code = NotFound desc = could not find container \"8b2175e8b0f3d0a1b8ef8642ce83c4b05c7ce1f29d0f3417822f71330a375049\": container with ID starting with 8b2175e8b0f3d0a1b8ef8642ce83c4b05c7ce1f29d0f3417822f71330a375049 not found: ID does not exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.610724 4860 scope.go:117] "RemoveContainer" containerID="9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.610935 4860 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9"} err="failed to get container status \"9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9\": rpc error: code = NotFound desc = could not find container \"9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9\": container with ID starting with 9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9 not found: ID does not exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.610962 4860 scope.go:117] "RemoveContainer" containerID="903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.611212 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf"} err="failed to get container status \"903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf\": rpc error: code = NotFound desc = could not find container \"903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf\": container with ID starting with 903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf not found: ID does not exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.611257 4860 scope.go:117] "RemoveContainer" containerID="f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.611477 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f"} err="failed to get container status \"f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f\": rpc error: code = NotFound desc = could not find container \"f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f\": container with ID starting with 
f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f not found: ID does not exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.611499 4860 scope.go:117] "RemoveContainer" containerID="9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.612124 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18"} err="failed to get container status \"9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18\": rpc error: code = NotFound desc = could not find container \"9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18\": container with ID starting with 9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18 not found: ID does not exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.612153 4860 scope.go:117] "RemoveContainer" containerID="9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.612407 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793"} err="failed to get container status \"9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793\": rpc error: code = NotFound desc = could not find container \"9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793\": container with ID starting with 9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793 not found: ID does not exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.612441 4860 scope.go:117] "RemoveContainer" containerID="a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.612879 4860 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e"} err="failed to get container status \"a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e\": rpc error: code = NotFound desc = could not find container \"a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e\": container with ID starting with a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e not found: ID does not exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.612900 4860 scope.go:117] "RemoveContainer" containerID="0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.613131 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d"} err="failed to get container status \"0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d\": rpc error: code = NotFound desc = could not find container \"0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d\": container with ID starting with 0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d not found: ID does not exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.613162 4860 scope.go:117] "RemoveContainer" containerID="95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.613390 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11"} err="failed to get container status \"95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\": rpc error: code = NotFound desc = could not find container \"95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\": container with ID starting with 95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11 not found: ID does not 
exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.613422 4860 scope.go:117] "RemoveContainer" containerID="4f5a30ad09d458bec4ede239893e4825bfe7b1e45f543e00827124da5aa465f6" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.613636 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f5a30ad09d458bec4ede239893e4825bfe7b1e45f543e00827124da5aa465f6"} err="failed to get container status \"4f5a30ad09d458bec4ede239893e4825bfe7b1e45f543e00827124da5aa465f6\": rpc error: code = NotFound desc = could not find container \"4f5a30ad09d458bec4ede239893e4825bfe7b1e45f543e00827124da5aa465f6\": container with ID starting with 4f5a30ad09d458bec4ede239893e4825bfe7b1e45f543e00827124da5aa465f6 not found: ID does not exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.613665 4860 scope.go:117] "RemoveContainer" containerID="8b2175e8b0f3d0a1b8ef8642ce83c4b05c7ce1f29d0f3417822f71330a375049" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.613868 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b2175e8b0f3d0a1b8ef8642ce83c4b05c7ce1f29d0f3417822f71330a375049"} err="failed to get container status \"8b2175e8b0f3d0a1b8ef8642ce83c4b05c7ce1f29d0f3417822f71330a375049\": rpc error: code = NotFound desc = could not find container \"8b2175e8b0f3d0a1b8ef8642ce83c4b05c7ce1f29d0f3417822f71330a375049\": container with ID starting with 8b2175e8b0f3d0a1b8ef8642ce83c4b05c7ce1f29d0f3417822f71330a375049 not found: ID does not exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.613890 4860 scope.go:117] "RemoveContainer" containerID="9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.614054 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9"} err="failed to get container status 
\"9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9\": rpc error: code = NotFound desc = could not find container \"9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9\": container with ID starting with 9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9 not found: ID does not exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.614078 4860 scope.go:117] "RemoveContainer" containerID="903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.614302 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf"} err="failed to get container status \"903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf\": rpc error: code = NotFound desc = could not find container \"903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf\": container with ID starting with 903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf not found: ID does not exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.614322 4860 scope.go:117] "RemoveContainer" containerID="f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.614530 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f"} err="failed to get container status \"f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f\": rpc error: code = NotFound desc = could not find container \"f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f\": container with ID starting with f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f not found: ID does not exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.614560 4860 scope.go:117] "RemoveContainer" 
containerID="9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.614892 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18"} err="failed to get container status \"9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18\": rpc error: code = NotFound desc = could not find container \"9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18\": container with ID starting with 9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18 not found: ID does not exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.614915 4860 scope.go:117] "RemoveContainer" containerID="9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.615139 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793"} err="failed to get container status \"9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793\": rpc error: code = NotFound desc = could not find container \"9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793\": container with ID starting with 9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793 not found: ID does not exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.615164 4860 scope.go:117] "RemoveContainer" containerID="a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.615443 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e"} err="failed to get container status \"a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e\": rpc error: code = NotFound desc = could 
not find container \"a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e\": container with ID starting with a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e not found: ID does not exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.615472 4860 scope.go:117] "RemoveContainer" containerID="0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.615745 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d"} err="failed to get container status \"0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d\": rpc error: code = NotFound desc = could not find container \"0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d\": container with ID starting with 0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d not found: ID does not exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.615766 4860 scope.go:117] "RemoveContainer" containerID="95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.615966 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11"} err="failed to get container status \"95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\": rpc error: code = NotFound desc = could not find container \"95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\": container with ID starting with 95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11 not found: ID does not exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.615988 4860 scope.go:117] "RemoveContainer" containerID="4f5a30ad09d458bec4ede239893e4825bfe7b1e45f543e00827124da5aa465f6" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 
11:09:51.616190 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f5a30ad09d458bec4ede239893e4825bfe7b1e45f543e00827124da5aa465f6"} err="failed to get container status \"4f5a30ad09d458bec4ede239893e4825bfe7b1e45f543e00827124da5aa465f6\": rpc error: code = NotFound desc = could not find container \"4f5a30ad09d458bec4ede239893e4825bfe7b1e45f543e00827124da5aa465f6\": container with ID starting with 4f5a30ad09d458bec4ede239893e4825bfe7b1e45f543e00827124da5aa465f6 not found: ID does not exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.616209 4860 scope.go:117] "RemoveContainer" containerID="8b2175e8b0f3d0a1b8ef8642ce83c4b05c7ce1f29d0f3417822f71330a375049" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.616458 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b2175e8b0f3d0a1b8ef8642ce83c4b05c7ce1f29d0f3417822f71330a375049"} err="failed to get container status \"8b2175e8b0f3d0a1b8ef8642ce83c4b05c7ce1f29d0f3417822f71330a375049\": rpc error: code = NotFound desc = could not find container \"8b2175e8b0f3d0a1b8ef8642ce83c4b05c7ce1f29d0f3417822f71330a375049\": container with ID starting with 8b2175e8b0f3d0a1b8ef8642ce83c4b05c7ce1f29d0f3417822f71330a375049 not found: ID does not exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.616476 4860 scope.go:117] "RemoveContainer" containerID="9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.616648 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9"} err="failed to get container status \"9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9\": rpc error: code = NotFound desc = could not find container \"9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9\": container with ID starting with 
9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9 not found: ID does not exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.616664 4860 scope.go:117] "RemoveContainer" containerID="903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.616871 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf"} err="failed to get container status \"903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf\": rpc error: code = NotFound desc = could not find container \"903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf\": container with ID starting with 903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf not found: ID does not exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.616894 4860 scope.go:117] "RemoveContainer" containerID="f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.617073 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f"} err="failed to get container status \"f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f\": rpc error: code = NotFound desc = could not find container \"f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f\": container with ID starting with f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f not found: ID does not exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.617090 4860 scope.go:117] "RemoveContainer" containerID="9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.617276 4860 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18"} err="failed to get container status \"9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18\": rpc error: code = NotFound desc = could not find container \"9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18\": container with ID starting with 9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18 not found: ID does not exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.617295 4860 scope.go:117] "RemoveContainer" containerID="9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.617513 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793"} err="failed to get container status \"9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793\": rpc error: code = NotFound desc = could not find container \"9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793\": container with ID starting with 9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793 not found: ID does not exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.617535 4860 scope.go:117] "RemoveContainer" containerID="a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.617778 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e"} err="failed to get container status \"a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e\": rpc error: code = NotFound desc = could not find container \"a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e\": container with ID starting with a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e not found: ID does not 
exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.617802 4860 scope.go:117] "RemoveContainer" containerID="0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.618048 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d"} err="failed to get container status \"0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d\": rpc error: code = NotFound desc = could not find container \"0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d\": container with ID starting with 0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d not found: ID does not exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.618070 4860 scope.go:117] "RemoveContainer" containerID="95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.618309 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11"} err="failed to get container status \"95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\": rpc error: code = NotFound desc = could not find container \"95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\": container with ID starting with 95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11 not found: ID does not exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.618340 4860 scope.go:117] "RemoveContainer" containerID="4f5a30ad09d458bec4ede239893e4825bfe7b1e45f543e00827124da5aa465f6" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.618561 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f5a30ad09d458bec4ede239893e4825bfe7b1e45f543e00827124da5aa465f6"} err="failed to get container status 
\"4f5a30ad09d458bec4ede239893e4825bfe7b1e45f543e00827124da5aa465f6\": rpc error: code = NotFound desc = could not find container \"4f5a30ad09d458bec4ede239893e4825bfe7b1e45f543e00827124da5aa465f6\": container with ID starting with 4f5a30ad09d458bec4ede239893e4825bfe7b1e45f543e00827124da5aa465f6 not found: ID does not exist" Mar 20 11:09:52 crc kubenswrapper[4860]: I0320 11:09:52.315793 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cmc44_a89c8af2-338f-401f-aad5-c6d7763a3b3a/kube-multus/2.log" Mar 20 11:09:52 crc kubenswrapper[4860]: I0320 11:09:52.317746 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cmc44_a89c8af2-338f-401f-aad5-c6d7763a3b3a/kube-multus/1.log" Mar 20 11:09:52 crc kubenswrapper[4860]: I0320 11:09:52.317827 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cmc44" event={"ID":"a89c8af2-338f-401f-aad5-c6d7763a3b3a","Type":"ContainerStarted","Data":"7c703638b7464205d064c1a5b6a628f2894ada53a1c2e318b74addf7a4cc0084"} Mar 20 11:09:52 crc kubenswrapper[4860]: I0320 11:09:52.321670 4860 generic.go:334] "Generic (PLEG): container finished" podID="fdde4a9b-30f5-42b1-8f84-d913cea8ddc8" containerID="52d88f1b9c02de4f8c5cab45a771fc2befc2a853cd5323af863096c67fea59eb" exitCode=0 Mar 20 11:09:52 crc kubenswrapper[4860]: I0320 11:09:52.321733 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" event={"ID":"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8","Type":"ContainerDied","Data":"52d88f1b9c02de4f8c5cab45a771fc2befc2a853cd5323af863096c67fea59eb"} Mar 20 11:09:52 crc kubenswrapper[4860]: I0320 11:09:52.321769 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" event={"ID":"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8","Type":"ContainerStarted","Data":"02f04e9c289e46cd553ca2efd780e95ad4b4557fff980223eedbc54acb01029d"} Mar 20 11:09:53 crc kubenswrapper[4860]: I0320 
11:09:53.341323 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" event={"ID":"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8","Type":"ContainerStarted","Data":"bc40f58fb5bcf926371f71ee5e80772f04da47e542e7a49fe7413b7e508c8c04"} Mar 20 11:09:53 crc kubenswrapper[4860]: I0320 11:09:53.342025 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" event={"ID":"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8","Type":"ContainerStarted","Data":"a6a6b3d2a971733c4b2f82419ebecf7a2d34500d575921ba4c12307aa3024e00"} Mar 20 11:09:53 crc kubenswrapper[4860]: I0320 11:09:53.342038 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" event={"ID":"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8","Type":"ContainerStarted","Data":"2d5efb607715bd0c42854e8ae2338908f90853fadc4ab20d761bdd4ec974649d"} Mar 20 11:09:53 crc kubenswrapper[4860]: I0320 11:09:53.342047 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" event={"ID":"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8","Type":"ContainerStarted","Data":"075796bbba6e7b4c490fe7e2252b847f258dea25f9078e7e5ce42fa3c844b6a2"} Mar 20 11:09:53 crc kubenswrapper[4860]: I0320 11:09:53.342056 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" event={"ID":"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8","Type":"ContainerStarted","Data":"6f70af02015847e71789804e8346eea16be92700864b6ada1b5c6fbcc8fa9e6a"} Mar 20 11:09:53 crc kubenswrapper[4860]: I0320 11:09:53.342064 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" event={"ID":"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8","Type":"ContainerStarted","Data":"516cb1dd67c70f5a3925ba5b7ff2619cd2629fd2d506cea8c8bc76b04bc2d491"} Mar 20 11:09:55 crc kubenswrapper[4860]: I0320 11:09:55.359533 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" event={"ID":"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8","Type":"ContainerStarted","Data":"0d5c317356089b76000465b1cf9a8c1fd521aaadf094be271294e8d5e4f0ff19"} Mar 20 11:09:58 crc kubenswrapper[4860]: I0320 11:09:58.306406 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-g98c9"] Mar 20 11:09:58 crc kubenswrapper[4860]: I0320 11:09:58.308247 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-g98c9" Mar 20 11:09:58 crc kubenswrapper[4860]: I0320 11:09:58.310804 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Mar 20 11:09:58 crc kubenswrapper[4860]: I0320 11:09:58.311153 4860 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-vn2jz" Mar 20 11:09:58 crc kubenswrapper[4860]: I0320 11:09:58.311322 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Mar 20 11:09:58 crc kubenswrapper[4860]: I0320 11:09:58.312264 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Mar 20 11:09:58 crc kubenswrapper[4860]: I0320 11:09:58.355892 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/09260d7e-28fd-4e8f-be6f-9b7df7c9d345-node-mnt\") pod \"crc-storage-crc-g98c9\" (UID: \"09260d7e-28fd-4e8f-be6f-9b7df7c9d345\") " pod="crc-storage/crc-storage-crc-g98c9" Mar 20 11:09:58 crc kubenswrapper[4860]: I0320 11:09:58.356008 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/09260d7e-28fd-4e8f-be6f-9b7df7c9d345-crc-storage\") pod \"crc-storage-crc-g98c9\" (UID: \"09260d7e-28fd-4e8f-be6f-9b7df7c9d345\") " pod="crc-storage/crc-storage-crc-g98c9" Mar 20 
11:09:58 crc kubenswrapper[4860]: I0320 11:09:58.356079 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qw9d\" (UniqueName: \"kubernetes.io/projected/09260d7e-28fd-4e8f-be6f-9b7df7c9d345-kube-api-access-8qw9d\") pod \"crc-storage-crc-g98c9\" (UID: \"09260d7e-28fd-4e8f-be6f-9b7df7c9d345\") " pod="crc-storage/crc-storage-crc-g98c9" Mar 20 11:09:58 crc kubenswrapper[4860]: I0320 11:09:58.386888 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" event={"ID":"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8","Type":"ContainerStarted","Data":"b0f1e383251f12ec61e4e27889258b8f20deff78aecf28fa42474323abad2c07"} Mar 20 11:09:58 crc kubenswrapper[4860]: I0320 11:09:58.387301 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:58 crc kubenswrapper[4860]: I0320 11:09:58.424206 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:58 crc kubenswrapper[4860]: I0320 11:09:58.427196 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" podStartSLOduration=8.427174249 podStartE2EDuration="8.427174249s" podCreationTimestamp="2026-03-20 11:09:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 11:09:58.423124038 +0000 UTC m=+922.644484956" watchObservedRunningTime="2026-03-20 11:09:58.427174249 +0000 UTC m=+922.648535147" Mar 20 11:09:58 crc kubenswrapper[4860]: I0320 11:09:58.458089 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qw9d\" (UniqueName: \"kubernetes.io/projected/09260d7e-28fd-4e8f-be6f-9b7df7c9d345-kube-api-access-8qw9d\") pod \"crc-storage-crc-g98c9\" (UID: 
\"09260d7e-28fd-4e8f-be6f-9b7df7c9d345\") " pod="crc-storage/crc-storage-crc-g98c9" Mar 20 11:09:58 crc kubenswrapper[4860]: I0320 11:09:58.458267 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/09260d7e-28fd-4e8f-be6f-9b7df7c9d345-node-mnt\") pod \"crc-storage-crc-g98c9\" (UID: \"09260d7e-28fd-4e8f-be6f-9b7df7c9d345\") " pod="crc-storage/crc-storage-crc-g98c9" Mar 20 11:09:58 crc kubenswrapper[4860]: I0320 11:09:58.458369 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/09260d7e-28fd-4e8f-be6f-9b7df7c9d345-crc-storage\") pod \"crc-storage-crc-g98c9\" (UID: \"09260d7e-28fd-4e8f-be6f-9b7df7c9d345\") " pod="crc-storage/crc-storage-crc-g98c9" Mar 20 11:09:58 crc kubenswrapper[4860]: I0320 11:09:58.460087 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/09260d7e-28fd-4e8f-be6f-9b7df7c9d345-node-mnt\") pod \"crc-storage-crc-g98c9\" (UID: \"09260d7e-28fd-4e8f-be6f-9b7df7c9d345\") " pod="crc-storage/crc-storage-crc-g98c9" Mar 20 11:09:58 crc kubenswrapper[4860]: I0320 11:09:58.460757 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/09260d7e-28fd-4e8f-be6f-9b7df7c9d345-crc-storage\") pod \"crc-storage-crc-g98c9\" (UID: \"09260d7e-28fd-4e8f-be6f-9b7df7c9d345\") " pod="crc-storage/crc-storage-crc-g98c9" Mar 20 11:09:58 crc kubenswrapper[4860]: I0320 11:09:58.483713 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qw9d\" (UniqueName: \"kubernetes.io/projected/09260d7e-28fd-4e8f-be6f-9b7df7c9d345-kube-api-access-8qw9d\") pod \"crc-storage-crc-g98c9\" (UID: \"09260d7e-28fd-4e8f-be6f-9b7df7c9d345\") " pod="crc-storage/crc-storage-crc-g98c9" Mar 20 11:09:58 crc kubenswrapper[4860]: I0320 11:09:58.628791 4860 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-g98c9" Mar 20 11:09:58 crc kubenswrapper[4860]: E0320 11:09:58.667020 4860 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-g98c9_crc-storage_09260d7e-28fd-4e8f-be6f-9b7df7c9d345_0(5ce6a54ea35dc7f5bd5fa36a6e600ad1429c6b162ccad3e02e6d432ffc563e9f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 11:09:58 crc kubenswrapper[4860]: E0320 11:09:58.667118 4860 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-g98c9_crc-storage_09260d7e-28fd-4e8f-be6f-9b7df7c9d345_0(5ce6a54ea35dc7f5bd5fa36a6e600ad1429c6b162ccad3e02e6d432ffc563e9f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-g98c9" Mar 20 11:09:58 crc kubenswrapper[4860]: E0320 11:09:58.667144 4860 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-g98c9_crc-storage_09260d7e-28fd-4e8f-be6f-9b7df7c9d345_0(5ce6a54ea35dc7f5bd5fa36a6e600ad1429c6b162ccad3e02e6d432ffc563e9f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="crc-storage/crc-storage-crc-g98c9" Mar 20 11:09:58 crc kubenswrapper[4860]: E0320 11:09:58.667198 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-g98c9_crc-storage(09260d7e-28fd-4e8f-be6f-9b7df7c9d345)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-g98c9_crc-storage(09260d7e-28fd-4e8f-be6f-9b7df7c9d345)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-g98c9_crc-storage_09260d7e-28fd-4e8f-be6f-9b7df7c9d345_0(5ce6a54ea35dc7f5bd5fa36a6e600ad1429c6b162ccad3e02e6d432ffc563e9f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-g98c9" podUID="09260d7e-28fd-4e8f-be6f-9b7df7c9d345" Mar 20 11:09:59 crc kubenswrapper[4860]: I0320 11:09:59.332079 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-g98c9"] Mar 20 11:09:59 crc kubenswrapper[4860]: I0320 11:09:59.392923 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-g98c9" Mar 20 11:09:59 crc kubenswrapper[4860]: I0320 11:09:59.393525 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-g98c9" Mar 20 11:09:59 crc kubenswrapper[4860]: I0320 11:09:59.393583 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:59 crc kubenswrapper[4860]: I0320 11:09:59.393644 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:59 crc kubenswrapper[4860]: E0320 11:09:59.434731 4860 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-g98c9_crc-storage_09260d7e-28fd-4e8f-be6f-9b7df7c9d345_0(abc20e4455bcf76ea67fb61ddab0074aed042ee93a8fa06ec9a40b00751c1f7f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 11:09:59 crc kubenswrapper[4860]: E0320 11:09:59.434815 4860 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-g98c9_crc-storage_09260d7e-28fd-4e8f-be6f-9b7df7c9d345_0(abc20e4455bcf76ea67fb61ddab0074aed042ee93a8fa06ec9a40b00751c1f7f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-g98c9" Mar 20 11:09:59 crc kubenswrapper[4860]: E0320 11:09:59.434845 4860 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-g98c9_crc-storage_09260d7e-28fd-4e8f-be6f-9b7df7c9d345_0(abc20e4455bcf76ea67fb61ddab0074aed042ee93a8fa06ec9a40b00751c1f7f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="crc-storage/crc-storage-crc-g98c9" Mar 20 11:09:59 crc kubenswrapper[4860]: E0320 11:09:59.434913 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-g98c9_crc-storage(09260d7e-28fd-4e8f-be6f-9b7df7c9d345)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-g98c9_crc-storage(09260d7e-28fd-4e8f-be6f-9b7df7c9d345)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-g98c9_crc-storage_09260d7e-28fd-4e8f-be6f-9b7df7c9d345_0(abc20e4455bcf76ea67fb61ddab0074aed042ee93a8fa06ec9a40b00751c1f7f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-g98c9" podUID="09260d7e-28fd-4e8f-be6f-9b7df7c9d345" Mar 20 11:09:59 crc kubenswrapper[4860]: I0320 11:09:59.438713 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:10:00 crc kubenswrapper[4860]: I0320 11:10:00.135454 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566750-bgxvq"] Mar 20 11:10:00 crc kubenswrapper[4860]: I0320 11:10:00.136720 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566750-bgxvq" Mar 20 11:10:00 crc kubenswrapper[4860]: I0320 11:10:00.140468 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:10:00 crc kubenswrapper[4860]: I0320 11:10:00.140589 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-82p2f" Mar 20 11:10:00 crc kubenswrapper[4860]: I0320 11:10:00.140832 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:10:00 crc kubenswrapper[4860]: I0320 11:10:00.147071 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566750-bgxvq"] Mar 20 11:10:00 crc kubenswrapper[4860]: I0320 11:10:00.185532 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjsbb\" (UniqueName: \"kubernetes.io/projected/b45dae17-b8e6-4d57-a525-2892e7ff37f7-kube-api-access-pjsbb\") pod \"auto-csr-approver-29566750-bgxvq\" (UID: \"b45dae17-b8e6-4d57-a525-2892e7ff37f7\") " pod="openshift-infra/auto-csr-approver-29566750-bgxvq" Mar 20 11:10:00 crc kubenswrapper[4860]: I0320 11:10:00.287300 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjsbb\" (UniqueName: \"kubernetes.io/projected/b45dae17-b8e6-4d57-a525-2892e7ff37f7-kube-api-access-pjsbb\") pod \"auto-csr-approver-29566750-bgxvq\" (UID: \"b45dae17-b8e6-4d57-a525-2892e7ff37f7\") " pod="openshift-infra/auto-csr-approver-29566750-bgxvq" Mar 20 11:10:00 crc kubenswrapper[4860]: I0320 11:10:00.315814 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjsbb\" (UniqueName: \"kubernetes.io/projected/b45dae17-b8e6-4d57-a525-2892e7ff37f7-kube-api-access-pjsbb\") pod \"auto-csr-approver-29566750-bgxvq\" (UID: \"b45dae17-b8e6-4d57-a525-2892e7ff37f7\") " 
pod="openshift-infra/auto-csr-approver-29566750-bgxvq" Mar 20 11:10:00 crc kubenswrapper[4860]: I0320 11:10:00.455299 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566750-bgxvq" Mar 20 11:10:00 crc kubenswrapper[4860]: E0320 11:10:00.501324 4860 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29566750-bgxvq_openshift-infra_b45dae17-b8e6-4d57-a525-2892e7ff37f7_0(ad242c905983cb71f20cf0e37431ef937d614d9396ea0adf6a6ac026cf6f95b5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 11:10:00 crc kubenswrapper[4860]: E0320 11:10:00.501840 4860 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29566750-bgxvq_openshift-infra_b45dae17-b8e6-4d57-a525-2892e7ff37f7_0(ad242c905983cb71f20cf0e37431ef937d614d9396ea0adf6a6ac026cf6f95b5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29566750-bgxvq" Mar 20 11:10:00 crc kubenswrapper[4860]: E0320 11:10:00.501871 4860 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29566750-bgxvq_openshift-infra_b45dae17-b8e6-4d57-a525-2892e7ff37f7_0(ad242c905983cb71f20cf0e37431ef937d614d9396ea0adf6a6ac026cf6f95b5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-infra/auto-csr-approver-29566750-bgxvq" Mar 20 11:10:00 crc kubenswrapper[4860]: E0320 11:10:00.501946 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"auto-csr-approver-29566750-bgxvq_openshift-infra(b45dae17-b8e6-4d57-a525-2892e7ff37f7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"auto-csr-approver-29566750-bgxvq_openshift-infra(b45dae17-b8e6-4d57-a525-2892e7ff37f7)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29566750-bgxvq_openshift-infra_b45dae17-b8e6-4d57-a525-2892e7ff37f7_0(ad242c905983cb71f20cf0e37431ef937d614d9396ea0adf6a6ac026cf6f95b5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-infra/auto-csr-approver-29566750-bgxvq" podUID="b45dae17-b8e6-4d57-a525-2892e7ff37f7" Mar 20 11:10:01 crc kubenswrapper[4860]: I0320 11:10:01.404491 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566750-bgxvq" Mar 20 11:10:01 crc kubenswrapper[4860]: I0320 11:10:01.405597 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566750-bgxvq" Mar 20 11:10:01 crc kubenswrapper[4860]: E0320 11:10:01.434178 4860 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29566750-bgxvq_openshift-infra_b45dae17-b8e6-4d57-a525-2892e7ff37f7_0(88f90069a7d42f6dc0026f46e01d212a855604ce260bf23f659b61e8f16ca249): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 11:10:01 crc kubenswrapper[4860]: E0320 11:10:01.434279 4860 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29566750-bgxvq_openshift-infra_b45dae17-b8e6-4d57-a525-2892e7ff37f7_0(88f90069a7d42f6dc0026f46e01d212a855604ce260bf23f659b61e8f16ca249): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29566750-bgxvq" Mar 20 11:10:01 crc kubenswrapper[4860]: E0320 11:10:01.434303 4860 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29566750-bgxvq_openshift-infra_b45dae17-b8e6-4d57-a525-2892e7ff37f7_0(88f90069a7d42f6dc0026f46e01d212a855604ce260bf23f659b61e8f16ca249): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29566750-bgxvq" Mar 20 11:10:01 crc kubenswrapper[4860]: E0320 11:10:01.434359 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"auto-csr-approver-29566750-bgxvq_openshift-infra(b45dae17-b8e6-4d57-a525-2892e7ff37f7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"auto-csr-approver-29566750-bgxvq_openshift-infra(b45dae17-b8e6-4d57-a525-2892e7ff37f7)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29566750-bgxvq_openshift-infra_b45dae17-b8e6-4d57-a525-2892e7ff37f7_0(88f90069a7d42f6dc0026f46e01d212a855604ce260bf23f659b61e8f16ca249): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-infra/auto-csr-approver-29566750-bgxvq" podUID="b45dae17-b8e6-4d57-a525-2892e7ff37f7" Mar 20 11:10:10 crc kubenswrapper[4860]: I0320 11:10:10.412656 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-g98c9" Mar 20 11:10:10 crc kubenswrapper[4860]: I0320 11:10:10.415545 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-g98c9" Mar 20 11:10:10 crc kubenswrapper[4860]: I0320 11:10:10.625977 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-g98c9"] Mar 20 11:10:10 crc kubenswrapper[4860]: I0320 11:10:10.635047 4860 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 11:10:11 crc kubenswrapper[4860]: I0320 11:10:11.471893 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-g98c9" event={"ID":"09260d7e-28fd-4e8f-be6f-9b7df7c9d345","Type":"ContainerStarted","Data":"63a4a2b38cf6b489d8cdfb9fb896b9060ca1337daad5de94707bfdf27b00375c"} Mar 20 11:10:12 crc kubenswrapper[4860]: I0320 11:10:12.479878 4860 generic.go:334] "Generic (PLEG): container finished" podID="09260d7e-28fd-4e8f-be6f-9b7df7c9d345" containerID="8bc4373ffa212d4a837568628c26dd14b62ba6066892f6d72ca8bf7d4caad612" exitCode=0 Mar 20 11:10:12 crc kubenswrapper[4860]: I0320 11:10:12.479928 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-g98c9" event={"ID":"09260d7e-28fd-4e8f-be6f-9b7df7c9d345","Type":"ContainerDied","Data":"8bc4373ffa212d4a837568628c26dd14b62ba6066892f6d72ca8bf7d4caad612"} Mar 20 11:10:13 crc kubenswrapper[4860]: I0320 11:10:13.727908 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-g98c9" Mar 20 11:10:13 crc kubenswrapper[4860]: I0320 11:10:13.778092 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/09260d7e-28fd-4e8f-be6f-9b7df7c9d345-crc-storage\") pod \"09260d7e-28fd-4e8f-be6f-9b7df7c9d345\" (UID: \"09260d7e-28fd-4e8f-be6f-9b7df7c9d345\") " Mar 20 11:10:13 crc kubenswrapper[4860]: I0320 11:10:13.778282 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/09260d7e-28fd-4e8f-be6f-9b7df7c9d345-node-mnt\") pod \"09260d7e-28fd-4e8f-be6f-9b7df7c9d345\" (UID: \"09260d7e-28fd-4e8f-be6f-9b7df7c9d345\") " Mar 20 11:10:13 crc kubenswrapper[4860]: I0320 11:10:13.778357 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qw9d\" (UniqueName: \"kubernetes.io/projected/09260d7e-28fd-4e8f-be6f-9b7df7c9d345-kube-api-access-8qw9d\") pod \"09260d7e-28fd-4e8f-be6f-9b7df7c9d345\" (UID: \"09260d7e-28fd-4e8f-be6f-9b7df7c9d345\") " Mar 20 11:10:13 crc kubenswrapper[4860]: I0320 11:10:13.779080 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/09260d7e-28fd-4e8f-be6f-9b7df7c9d345-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "09260d7e-28fd-4e8f-be6f-9b7df7c9d345" (UID: "09260d7e-28fd-4e8f-be6f-9b7df7c9d345"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:10:13 crc kubenswrapper[4860]: I0320 11:10:13.785832 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09260d7e-28fd-4e8f-be6f-9b7df7c9d345-kube-api-access-8qw9d" (OuterVolumeSpecName: "kube-api-access-8qw9d") pod "09260d7e-28fd-4e8f-be6f-9b7df7c9d345" (UID: "09260d7e-28fd-4e8f-be6f-9b7df7c9d345"). InnerVolumeSpecName "kube-api-access-8qw9d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:10:13 crc kubenswrapper[4860]: I0320 11:10:13.795839 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09260d7e-28fd-4e8f-be6f-9b7df7c9d345-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "09260d7e-28fd-4e8f-be6f-9b7df7c9d345" (UID: "09260d7e-28fd-4e8f-be6f-9b7df7c9d345"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:10:13 crc kubenswrapper[4860]: I0320 11:10:13.879782 4860 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/09260d7e-28fd-4e8f-be6f-9b7df7c9d345-node-mnt\") on node \"crc\" DevicePath \"\"" Mar 20 11:10:13 crc kubenswrapper[4860]: I0320 11:10:13.879826 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qw9d\" (UniqueName: \"kubernetes.io/projected/09260d7e-28fd-4e8f-be6f-9b7df7c9d345-kube-api-access-8qw9d\") on node \"crc\" DevicePath \"\"" Mar 20 11:10:13 crc kubenswrapper[4860]: I0320 11:10:13.879839 4860 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/09260d7e-28fd-4e8f-be6f-9b7df7c9d345-crc-storage\") on node \"crc\" DevicePath \"\"" Mar 20 11:10:14 crc kubenswrapper[4860]: I0320 11:10:14.413964 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566750-bgxvq" Mar 20 11:10:14 crc kubenswrapper[4860]: I0320 11:10:14.415489 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566750-bgxvq" Mar 20 11:10:14 crc kubenswrapper[4860]: I0320 11:10:14.494724 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-g98c9" event={"ID":"09260d7e-28fd-4e8f-be6f-9b7df7c9d345","Type":"ContainerDied","Data":"63a4a2b38cf6b489d8cdfb9fb896b9060ca1337daad5de94707bfdf27b00375c"} Mar 20 11:10:14 crc kubenswrapper[4860]: I0320 11:10:14.494779 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63a4a2b38cf6b489d8cdfb9fb896b9060ca1337daad5de94707bfdf27b00375c" Mar 20 11:10:14 crc kubenswrapper[4860]: I0320 11:10:14.494859 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-g98c9" Mar 20 11:10:14 crc kubenswrapper[4860]: I0320 11:10:14.609006 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566750-bgxvq"] Mar 20 11:10:14 crc kubenswrapper[4860]: W0320 11:10:14.617848 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb45dae17_b8e6_4d57_a525_2892e7ff37f7.slice/crio-000d571be4adbaa240194972d81bb16e55a829f676ceb0a0e5868075f3df2036 WatchSource:0}: Error finding container 000d571be4adbaa240194972d81bb16e55a829f676ceb0a0e5868075f3df2036: Status 404 returned error can't find the container with id 000d571be4adbaa240194972d81bb16e55a829f676ceb0a0e5868075f3df2036 Mar 20 11:10:15 crc kubenswrapper[4860]: I0320 11:10:15.502687 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566750-bgxvq" event={"ID":"b45dae17-b8e6-4d57-a525-2892e7ff37f7","Type":"ContainerStarted","Data":"000d571be4adbaa240194972d81bb16e55a829f676ceb0a0e5868075f3df2036"} Mar 20 11:10:16 crc kubenswrapper[4860]: I0320 11:10:16.510367 4860 generic.go:334] "Generic (PLEG): container finished" podID="b45dae17-b8e6-4d57-a525-2892e7ff37f7" 
containerID="bb236d0c90b35c798ab0b91ca64ed98eb462e09d8cbe538c6779b53064938615" exitCode=0 Mar 20 11:10:16 crc kubenswrapper[4860]: I0320 11:10:16.510415 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566750-bgxvq" event={"ID":"b45dae17-b8e6-4d57-a525-2892e7ff37f7","Type":"ContainerDied","Data":"bb236d0c90b35c798ab0b91ca64ed98eb462e09d8cbe538c6779b53064938615"} Mar 20 11:10:17 crc kubenswrapper[4860]: I0320 11:10:17.751572 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566750-bgxvq" Mar 20 11:10:17 crc kubenswrapper[4860]: I0320 11:10:17.832751 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjsbb\" (UniqueName: \"kubernetes.io/projected/b45dae17-b8e6-4d57-a525-2892e7ff37f7-kube-api-access-pjsbb\") pod \"b45dae17-b8e6-4d57-a525-2892e7ff37f7\" (UID: \"b45dae17-b8e6-4d57-a525-2892e7ff37f7\") " Mar 20 11:10:17 crc kubenswrapper[4860]: I0320 11:10:17.841112 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b45dae17-b8e6-4d57-a525-2892e7ff37f7-kube-api-access-pjsbb" (OuterVolumeSpecName: "kube-api-access-pjsbb") pod "b45dae17-b8e6-4d57-a525-2892e7ff37f7" (UID: "b45dae17-b8e6-4d57-a525-2892e7ff37f7"). InnerVolumeSpecName "kube-api-access-pjsbb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:10:17 crc kubenswrapper[4860]: I0320 11:10:17.935117 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjsbb\" (UniqueName: \"kubernetes.io/projected/b45dae17-b8e6-4d57-a525-2892e7ff37f7-kube-api-access-pjsbb\") on node \"crc\" DevicePath \"\"" Mar 20 11:10:18 crc kubenswrapper[4860]: I0320 11:10:18.525585 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566750-bgxvq" event={"ID":"b45dae17-b8e6-4d57-a525-2892e7ff37f7","Type":"ContainerDied","Data":"000d571be4adbaa240194972d81bb16e55a829f676ceb0a0e5868075f3df2036"} Mar 20 11:10:18 crc kubenswrapper[4860]: I0320 11:10:18.525630 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="000d571be4adbaa240194972d81bb16e55a829f676ceb0a0e5868075f3df2036" Mar 20 11:10:18 crc kubenswrapper[4860]: I0320 11:10:18.525655 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566750-bgxvq" Mar 20 11:10:18 crc kubenswrapper[4860]: I0320 11:10:18.826824 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566744-9jlnw"] Mar 20 11:10:18 crc kubenswrapper[4860]: I0320 11:10:18.832819 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566744-9jlnw"] Mar 20 11:10:19 crc kubenswrapper[4860]: I0320 11:10:19.421423 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d72f65a1-efc6-45f5-a056-d64ee5bce755" path="/var/lib/kubelet/pods/d72f65a1-efc6-45f5-a056-d64ee5bce755/volumes" Mar 20 11:10:20 crc kubenswrapper[4860]: I0320 11:10:20.618104 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746cdjd"] Mar 20 11:10:20 crc kubenswrapper[4860]: E0320 11:10:20.618952 4860 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="09260d7e-28fd-4e8f-be6f-9b7df7c9d345" containerName="storage" Mar 20 11:10:20 crc kubenswrapper[4860]: I0320 11:10:20.618972 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="09260d7e-28fd-4e8f-be6f-9b7df7c9d345" containerName="storage" Mar 20 11:10:20 crc kubenswrapper[4860]: E0320 11:10:20.618993 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b45dae17-b8e6-4d57-a525-2892e7ff37f7" containerName="oc" Mar 20 11:10:20 crc kubenswrapper[4860]: I0320 11:10:20.619001 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="b45dae17-b8e6-4d57-a525-2892e7ff37f7" containerName="oc" Mar 20 11:10:20 crc kubenswrapper[4860]: I0320 11:10:20.619121 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="09260d7e-28fd-4e8f-be6f-9b7df7c9d345" containerName="storage" Mar 20 11:10:20 crc kubenswrapper[4860]: I0320 11:10:20.619138 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="b45dae17-b8e6-4d57-a525-2892e7ff37f7" containerName="oc" Mar 20 11:10:20 crc kubenswrapper[4860]: I0320 11:10:20.620093 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746cdjd" Mar 20 11:10:20 crc kubenswrapper[4860]: I0320 11:10:20.622687 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 20 11:10:20 crc kubenswrapper[4860]: I0320 11:10:20.629830 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746cdjd"] Mar 20 11:10:20 crc kubenswrapper[4860]: I0320 11:10:20.673264 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eb4de49a-fca6-4c8c-8484-461859f95884-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746cdjd\" (UID: \"eb4de49a-fca6-4c8c-8484-461859f95884\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746cdjd" Mar 20 11:10:20 crc kubenswrapper[4860]: I0320 11:10:20.673321 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eb4de49a-fca6-4c8c-8484-461859f95884-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746cdjd\" (UID: \"eb4de49a-fca6-4c8c-8484-461859f95884\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746cdjd" Mar 20 11:10:20 crc kubenswrapper[4860]: I0320 11:10:20.673435 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhxht\" (UniqueName: \"kubernetes.io/projected/eb4de49a-fca6-4c8c-8484-461859f95884-kube-api-access-xhxht\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746cdjd\" (UID: \"eb4de49a-fca6-4c8c-8484-461859f95884\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746cdjd" Mar 20 11:10:20 crc kubenswrapper[4860]: 
I0320 11:10:20.775176 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhxht\" (UniqueName: \"kubernetes.io/projected/eb4de49a-fca6-4c8c-8484-461859f95884-kube-api-access-xhxht\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746cdjd\" (UID: \"eb4de49a-fca6-4c8c-8484-461859f95884\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746cdjd" Mar 20 11:10:20 crc kubenswrapper[4860]: I0320 11:10:20.775271 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eb4de49a-fca6-4c8c-8484-461859f95884-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746cdjd\" (UID: \"eb4de49a-fca6-4c8c-8484-461859f95884\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746cdjd" Mar 20 11:10:20 crc kubenswrapper[4860]: I0320 11:10:20.775312 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eb4de49a-fca6-4c8c-8484-461859f95884-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746cdjd\" (UID: \"eb4de49a-fca6-4c8c-8484-461859f95884\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746cdjd" Mar 20 11:10:20 crc kubenswrapper[4860]: I0320 11:10:20.775931 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eb4de49a-fca6-4c8c-8484-461859f95884-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746cdjd\" (UID: \"eb4de49a-fca6-4c8c-8484-461859f95884\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746cdjd" Mar 20 11:10:20 crc kubenswrapper[4860]: I0320 11:10:20.776130 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/eb4de49a-fca6-4c8c-8484-461859f95884-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746cdjd\" (UID: \"eb4de49a-fca6-4c8c-8484-461859f95884\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746cdjd" Mar 20 11:10:20 crc kubenswrapper[4860]: I0320 11:10:20.794289 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhxht\" (UniqueName: \"kubernetes.io/projected/eb4de49a-fca6-4c8c-8484-461859f95884-kube-api-access-xhxht\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746cdjd\" (UID: \"eb4de49a-fca6-4c8c-8484-461859f95884\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746cdjd" Mar 20 11:10:20 crc kubenswrapper[4860]: I0320 11:10:20.944297 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746cdjd" Mar 20 11:10:21 crc kubenswrapper[4860]: I0320 11:10:21.157672 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746cdjd"] Mar 20 11:10:21 crc kubenswrapper[4860]: W0320 11:10:21.164473 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb4de49a_fca6_4c8c_8484_461859f95884.slice/crio-ec54a7908fbfc0ba4a1a04b89bb324101104b953442cebe94810ba54c04ca6dd WatchSource:0}: Error finding container ec54a7908fbfc0ba4a1a04b89bb324101104b953442cebe94810ba54c04ca6dd: Status 404 returned error can't find the container with id ec54a7908fbfc0ba4a1a04b89bb324101104b953442cebe94810ba54c04ca6dd Mar 20 11:10:21 crc kubenswrapper[4860]: I0320 11:10:21.374366 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:10:21 crc kubenswrapper[4860]: I0320 11:10:21.545586 4860 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746cdjd" event={"ID":"eb4de49a-fca6-4c8c-8484-461859f95884","Type":"ContainerStarted","Data":"9e545cf0d8db4a6df524f343b19b1c45cdf0a040ca55b4d3380d49fe8f3dd5fd"} Mar 20 11:10:21 crc kubenswrapper[4860]: I0320 11:10:21.546503 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746cdjd" event={"ID":"eb4de49a-fca6-4c8c-8484-461859f95884","Type":"ContainerStarted","Data":"ec54a7908fbfc0ba4a1a04b89bb324101104b953442cebe94810ba54c04ca6dd"} Mar 20 11:10:22 crc kubenswrapper[4860]: I0320 11:10:22.552954 4860 generic.go:334] "Generic (PLEG): container finished" podID="eb4de49a-fca6-4c8c-8484-461859f95884" containerID="9e545cf0d8db4a6df524f343b19b1c45cdf0a040ca55b4d3380d49fe8f3dd5fd" exitCode=0 Mar 20 11:10:22 crc kubenswrapper[4860]: I0320 11:10:22.553018 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746cdjd" event={"ID":"eb4de49a-fca6-4c8c-8484-461859f95884","Type":"ContainerDied","Data":"9e545cf0d8db4a6df524f343b19b1c45cdf0a040ca55b4d3380d49fe8f3dd5fd"} Mar 20 11:10:22 crc kubenswrapper[4860]: I0320 11:10:22.735093 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-b9j9j"] Mar 20 11:10:22 crc kubenswrapper[4860]: I0320 11:10:22.736978 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-b9j9j" Mar 20 11:10:22 crc kubenswrapper[4860]: I0320 11:10:22.751997 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b9j9j"] Mar 20 11:10:22 crc kubenswrapper[4860]: I0320 11:10:22.805557 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7tgs\" (UniqueName: \"kubernetes.io/projected/be396660-5e1e-4dfe-9a08-26b2fcc69a0a-kube-api-access-q7tgs\") pod \"redhat-operators-b9j9j\" (UID: \"be396660-5e1e-4dfe-9a08-26b2fcc69a0a\") " pod="openshift-marketplace/redhat-operators-b9j9j" Mar 20 11:10:22 crc kubenswrapper[4860]: I0320 11:10:22.805638 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be396660-5e1e-4dfe-9a08-26b2fcc69a0a-utilities\") pod \"redhat-operators-b9j9j\" (UID: \"be396660-5e1e-4dfe-9a08-26b2fcc69a0a\") " pod="openshift-marketplace/redhat-operators-b9j9j" Mar 20 11:10:22 crc kubenswrapper[4860]: I0320 11:10:22.805668 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be396660-5e1e-4dfe-9a08-26b2fcc69a0a-catalog-content\") pod \"redhat-operators-b9j9j\" (UID: \"be396660-5e1e-4dfe-9a08-26b2fcc69a0a\") " pod="openshift-marketplace/redhat-operators-b9j9j" Mar 20 11:10:22 crc kubenswrapper[4860]: I0320 11:10:22.906579 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7tgs\" (UniqueName: \"kubernetes.io/projected/be396660-5e1e-4dfe-9a08-26b2fcc69a0a-kube-api-access-q7tgs\") pod \"redhat-operators-b9j9j\" (UID: \"be396660-5e1e-4dfe-9a08-26b2fcc69a0a\") " pod="openshift-marketplace/redhat-operators-b9j9j" Mar 20 11:10:22 crc kubenswrapper[4860]: I0320 11:10:22.906648 4860 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be396660-5e1e-4dfe-9a08-26b2fcc69a0a-utilities\") pod \"redhat-operators-b9j9j\" (UID: \"be396660-5e1e-4dfe-9a08-26b2fcc69a0a\") " pod="openshift-marketplace/redhat-operators-b9j9j" Mar 20 11:10:22 crc kubenswrapper[4860]: I0320 11:10:22.906670 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be396660-5e1e-4dfe-9a08-26b2fcc69a0a-catalog-content\") pod \"redhat-operators-b9j9j\" (UID: \"be396660-5e1e-4dfe-9a08-26b2fcc69a0a\") " pod="openshift-marketplace/redhat-operators-b9j9j" Mar 20 11:10:22 crc kubenswrapper[4860]: I0320 11:10:22.907637 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be396660-5e1e-4dfe-9a08-26b2fcc69a0a-catalog-content\") pod \"redhat-operators-b9j9j\" (UID: \"be396660-5e1e-4dfe-9a08-26b2fcc69a0a\") " pod="openshift-marketplace/redhat-operators-b9j9j" Mar 20 11:10:22 crc kubenswrapper[4860]: I0320 11:10:22.907990 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be396660-5e1e-4dfe-9a08-26b2fcc69a0a-utilities\") pod \"redhat-operators-b9j9j\" (UID: \"be396660-5e1e-4dfe-9a08-26b2fcc69a0a\") " pod="openshift-marketplace/redhat-operators-b9j9j" Mar 20 11:10:22 crc kubenswrapper[4860]: I0320 11:10:22.940315 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7tgs\" (UniqueName: \"kubernetes.io/projected/be396660-5e1e-4dfe-9a08-26b2fcc69a0a-kube-api-access-q7tgs\") pod \"redhat-operators-b9j9j\" (UID: \"be396660-5e1e-4dfe-9a08-26b2fcc69a0a\") " pod="openshift-marketplace/redhat-operators-b9j9j" Mar 20 11:10:23 crc kubenswrapper[4860]: I0320 11:10:23.056266 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-b9j9j" Mar 20 11:10:23 crc kubenswrapper[4860]: I0320 11:10:23.287651 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b9j9j"] Mar 20 11:10:23 crc kubenswrapper[4860]: I0320 11:10:23.561137 4860 generic.go:334] "Generic (PLEG): container finished" podID="be396660-5e1e-4dfe-9a08-26b2fcc69a0a" containerID="6175a4aeb09e5e4ddb845b3a23c481aab432f44fb75f04e279bd3991e69ab364" exitCode=0 Mar 20 11:10:23 crc kubenswrapper[4860]: I0320 11:10:23.561197 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b9j9j" event={"ID":"be396660-5e1e-4dfe-9a08-26b2fcc69a0a","Type":"ContainerDied","Data":"6175a4aeb09e5e4ddb845b3a23c481aab432f44fb75f04e279bd3991e69ab364"} Mar 20 11:10:23 crc kubenswrapper[4860]: I0320 11:10:23.561684 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b9j9j" event={"ID":"be396660-5e1e-4dfe-9a08-26b2fcc69a0a","Type":"ContainerStarted","Data":"7ba86a3ba39138c4bcf143a2055d72aa4da66a6fe5dfcdc6f44ec0ae82cefec5"} Mar 20 11:10:24 crc kubenswrapper[4860]: I0320 11:10:24.577216 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b9j9j" event={"ID":"be396660-5e1e-4dfe-9a08-26b2fcc69a0a","Type":"ContainerStarted","Data":"e64b2de3852cba02ad07be7827131798bd5071b683d1f700c0238cb432da1cfd"} Mar 20 11:10:24 crc kubenswrapper[4860]: I0320 11:10:24.580027 4860 generic.go:334] "Generic (PLEG): container finished" podID="eb4de49a-fca6-4c8c-8484-461859f95884" containerID="2733325d085ecbc581f2fe5823956c910410e5a3d79cfb6bd9c2dc10591bf08e" exitCode=0 Mar 20 11:10:24 crc kubenswrapper[4860]: I0320 11:10:24.580079 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746cdjd" 
event={"ID":"eb4de49a-fca6-4c8c-8484-461859f95884","Type":"ContainerDied","Data":"2733325d085ecbc581f2fe5823956c910410e5a3d79cfb6bd9c2dc10591bf08e"} Mar 20 11:10:25 crc kubenswrapper[4860]: I0320 11:10:25.588582 4860 generic.go:334] "Generic (PLEG): container finished" podID="eb4de49a-fca6-4c8c-8484-461859f95884" containerID="c2c4cd967c16c24218315f5eedf406be6b1dec547a49ac6f6d3c445bafe22901" exitCode=0 Mar 20 11:10:25 crc kubenswrapper[4860]: I0320 11:10:25.588803 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746cdjd" event={"ID":"eb4de49a-fca6-4c8c-8484-461859f95884","Type":"ContainerDied","Data":"c2c4cd967c16c24218315f5eedf406be6b1dec547a49ac6f6d3c445bafe22901"} Mar 20 11:10:26 crc kubenswrapper[4860]: I0320 11:10:26.598525 4860 generic.go:334] "Generic (PLEG): container finished" podID="be396660-5e1e-4dfe-9a08-26b2fcc69a0a" containerID="e64b2de3852cba02ad07be7827131798bd5071b683d1f700c0238cb432da1cfd" exitCode=0 Mar 20 11:10:26 crc kubenswrapper[4860]: I0320 11:10:26.598597 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b9j9j" event={"ID":"be396660-5e1e-4dfe-9a08-26b2fcc69a0a","Type":"ContainerDied","Data":"e64b2de3852cba02ad07be7827131798bd5071b683d1f700c0238cb432da1cfd"} Mar 20 11:10:26 crc kubenswrapper[4860]: I0320 11:10:26.946697 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746cdjd" Mar 20 11:10:27 crc kubenswrapper[4860]: I0320 11:10:27.064660 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eb4de49a-fca6-4c8c-8484-461859f95884-util\") pod \"eb4de49a-fca6-4c8c-8484-461859f95884\" (UID: \"eb4de49a-fca6-4c8c-8484-461859f95884\") " Mar 20 11:10:27 crc kubenswrapper[4860]: I0320 11:10:27.064841 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eb4de49a-fca6-4c8c-8484-461859f95884-bundle\") pod \"eb4de49a-fca6-4c8c-8484-461859f95884\" (UID: \"eb4de49a-fca6-4c8c-8484-461859f95884\") " Mar 20 11:10:27 crc kubenswrapper[4860]: I0320 11:10:27.064900 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhxht\" (UniqueName: \"kubernetes.io/projected/eb4de49a-fca6-4c8c-8484-461859f95884-kube-api-access-xhxht\") pod \"eb4de49a-fca6-4c8c-8484-461859f95884\" (UID: \"eb4de49a-fca6-4c8c-8484-461859f95884\") " Mar 20 11:10:27 crc kubenswrapper[4860]: I0320 11:10:27.065663 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb4de49a-fca6-4c8c-8484-461859f95884-bundle" (OuterVolumeSpecName: "bundle") pod "eb4de49a-fca6-4c8c-8484-461859f95884" (UID: "eb4de49a-fca6-4c8c-8484-461859f95884"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:10:27 crc kubenswrapper[4860]: I0320 11:10:27.072349 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb4de49a-fca6-4c8c-8484-461859f95884-kube-api-access-xhxht" (OuterVolumeSpecName: "kube-api-access-xhxht") pod "eb4de49a-fca6-4c8c-8484-461859f95884" (UID: "eb4de49a-fca6-4c8c-8484-461859f95884"). InnerVolumeSpecName "kube-api-access-xhxht". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:10:27 crc kubenswrapper[4860]: I0320 11:10:27.075605 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb4de49a-fca6-4c8c-8484-461859f95884-util" (OuterVolumeSpecName: "util") pod "eb4de49a-fca6-4c8c-8484-461859f95884" (UID: "eb4de49a-fca6-4c8c-8484-461859f95884"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:10:27 crc kubenswrapper[4860]: I0320 11:10:27.165931 4860 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eb4de49a-fca6-4c8c-8484-461859f95884-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 11:10:27 crc kubenswrapper[4860]: I0320 11:10:27.165973 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xhxht\" (UniqueName: \"kubernetes.io/projected/eb4de49a-fca6-4c8c-8484-461859f95884-kube-api-access-xhxht\") on node \"crc\" DevicePath \"\"" Mar 20 11:10:27 crc kubenswrapper[4860]: I0320 11:10:27.165993 4860 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eb4de49a-fca6-4c8c-8484-461859f95884-util\") on node \"crc\" DevicePath \"\"" Mar 20 11:10:27 crc kubenswrapper[4860]: I0320 11:10:27.607326 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746cdjd" event={"ID":"eb4de49a-fca6-4c8c-8484-461859f95884","Type":"ContainerDied","Data":"ec54a7908fbfc0ba4a1a04b89bb324101104b953442cebe94810ba54c04ca6dd"} Mar 20 11:10:27 crc kubenswrapper[4860]: I0320 11:10:27.607721 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec54a7908fbfc0ba4a1a04b89bb324101104b953442cebe94810ba54c04ca6dd" Mar 20 11:10:27 crc kubenswrapper[4860]: I0320 11:10:27.607341 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746cdjd" Mar 20 11:10:27 crc kubenswrapper[4860]: I0320 11:10:27.611708 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b9j9j" event={"ID":"be396660-5e1e-4dfe-9a08-26b2fcc69a0a","Type":"ContainerStarted","Data":"cd3a05044bfe778d42dcd060ef5d691bbbe313f7cf0b5e7dfb5bc41edbe2c760"} Mar 20 11:10:27 crc kubenswrapper[4860]: I0320 11:10:27.630213 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-b9j9j" podStartSLOduration=2.115647349 podStartE2EDuration="5.630182772s" podCreationTimestamp="2026-03-20 11:10:22 +0000 UTC" firstStartedPulling="2026-03-20 11:10:23.562805017 +0000 UTC m=+947.784165915" lastFinishedPulling="2026-03-20 11:10:27.07734044 +0000 UTC m=+951.298701338" observedRunningTime="2026-03-20 11:10:27.629620517 +0000 UTC m=+951.850981415" watchObservedRunningTime="2026-03-20 11:10:27.630182772 +0000 UTC m=+951.851543690" Mar 20 11:10:31 crc kubenswrapper[4860]: I0320 11:10:31.046108 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-dmczs"] Mar 20 11:10:31 crc kubenswrapper[4860]: E0320 11:10:31.046917 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb4de49a-fca6-4c8c-8484-461859f95884" containerName="util" Mar 20 11:10:31 crc kubenswrapper[4860]: I0320 11:10:31.046937 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb4de49a-fca6-4c8c-8484-461859f95884" containerName="util" Mar 20 11:10:31 crc kubenswrapper[4860]: E0320 11:10:31.046959 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb4de49a-fca6-4c8c-8484-461859f95884" containerName="pull" Mar 20 11:10:31 crc kubenswrapper[4860]: I0320 11:10:31.046969 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb4de49a-fca6-4c8c-8484-461859f95884" containerName="pull" Mar 20 
11:10:31 crc kubenswrapper[4860]: E0320 11:10:31.046982 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb4de49a-fca6-4c8c-8484-461859f95884" containerName="extract" Mar 20 11:10:31 crc kubenswrapper[4860]: I0320 11:10:31.046993 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb4de49a-fca6-4c8c-8484-461859f95884" containerName="extract" Mar 20 11:10:31 crc kubenswrapper[4860]: I0320 11:10:31.047135 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb4de49a-fca6-4c8c-8484-461859f95884" containerName="extract" Mar 20 11:10:31 crc kubenswrapper[4860]: I0320 11:10:31.047715 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-dmczs" Mar 20 11:10:31 crc kubenswrapper[4860]: I0320 11:10:31.050512 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 20 11:10:31 crc kubenswrapper[4860]: I0320 11:10:31.050964 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-c67w4" Mar 20 11:10:31 crc kubenswrapper[4860]: I0320 11:10:31.051283 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 20 11:10:31 crc kubenswrapper[4860]: I0320 11:10:31.068530 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-dmczs"] Mar 20 11:10:31 crc kubenswrapper[4860]: I0320 11:10:31.122511 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rvjp\" (UniqueName: \"kubernetes.io/projected/ce7d9f29-28cd-4038-b492-b18e0b129907-kube-api-access-4rvjp\") pod \"nmstate-operator-796d4cfff4-dmczs\" (UID: \"ce7d9f29-28cd-4038-b492-b18e0b129907\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-dmczs" Mar 20 11:10:31 crc kubenswrapper[4860]: I0320 11:10:31.223653 4860 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rvjp\" (UniqueName: \"kubernetes.io/projected/ce7d9f29-28cd-4038-b492-b18e0b129907-kube-api-access-4rvjp\") pod \"nmstate-operator-796d4cfff4-dmczs\" (UID: \"ce7d9f29-28cd-4038-b492-b18e0b129907\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-dmczs" Mar 20 11:10:31 crc kubenswrapper[4860]: I0320 11:10:31.244626 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rvjp\" (UniqueName: \"kubernetes.io/projected/ce7d9f29-28cd-4038-b492-b18e0b129907-kube-api-access-4rvjp\") pod \"nmstate-operator-796d4cfff4-dmczs\" (UID: \"ce7d9f29-28cd-4038-b492-b18e0b129907\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-dmczs" Mar 20 11:10:31 crc kubenswrapper[4860]: I0320 11:10:31.367332 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-dmczs" Mar 20 11:10:31 crc kubenswrapper[4860]: I0320 11:10:31.616549 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-dmczs"] Mar 20 11:10:31 crc kubenswrapper[4860]: I0320 11:10:31.651332 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-dmczs" event={"ID":"ce7d9f29-28cd-4038-b492-b18e0b129907","Type":"ContainerStarted","Data":"3b92c0467e344677f1da4a7e33230a9b30dc7ba817ecf222a202b4656bb4d20f"} Mar 20 11:10:33 crc kubenswrapper[4860]: I0320 11:10:33.057128 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-b9j9j" Mar 20 11:10:33 crc kubenswrapper[4860]: I0320 11:10:33.057923 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-b9j9j" Mar 20 11:10:34 crc kubenswrapper[4860]: I0320 11:10:34.101939 4860 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-b9j9j" podUID="be396660-5e1e-4dfe-9a08-26b2fcc69a0a" containerName="registry-server" probeResult="failure" output=< Mar 20 11:10:34 crc kubenswrapper[4860]: timeout: failed to connect service ":50051" within 1s Mar 20 11:10:34 crc kubenswrapper[4860]: > Mar 20 11:10:34 crc kubenswrapper[4860]: I0320 11:10:34.673563 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-dmczs" event={"ID":"ce7d9f29-28cd-4038-b492-b18e0b129907","Type":"ContainerStarted","Data":"085d319dc17f08845c93b80c65b00d0c34e8491a854672021fff83f2f6fa35f8"} Mar 20 11:10:34 crc kubenswrapper[4860]: I0320 11:10:34.692430 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-796d4cfff4-dmczs" podStartSLOduration=0.91749681 podStartE2EDuration="3.692411405s" podCreationTimestamp="2026-03-20 11:10:31 +0000 UTC" firstStartedPulling="2026-03-20 11:10:31.621325994 +0000 UTC m=+955.842686892" lastFinishedPulling="2026-03-20 11:10:34.396240589 +0000 UTC m=+958.617601487" observedRunningTime="2026-03-20 11:10:34.691504861 +0000 UTC m=+958.912865759" watchObservedRunningTime="2026-03-20 11:10:34.692411405 +0000 UTC m=+958.913772303" Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.622839 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-wr9vc"] Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.625073 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-wr9vc" Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.627586 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-rwxzd" Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.640838 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-wr9vc"] Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.647468 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-9cfpv"] Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.648546 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-9cfpv" Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.661803 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2qvg\" (UniqueName: \"kubernetes.io/projected/6f56c0b5-3d27-49e6-af5b-6ad929d9e857-kube-api-access-k2qvg\") pod \"nmstate-metrics-9b8c8685d-wr9vc\" (UID: \"6f56c0b5-3d27-49e6-af5b-6ad929d9e857\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-wr9vc" Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.663517 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.680597 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-9cfpv"] Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.690452 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-mdh82"] Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.691676 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-mdh82" Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.763240 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/ef7f3b63-3a7d-483b-95c1-32961dad6226-ovs-socket\") pod \"nmstate-handler-mdh82\" (UID: \"ef7f3b63-3a7d-483b-95c1-32961dad6226\") " pod="openshift-nmstate/nmstate-handler-mdh82" Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.763627 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/ef7f3b63-3a7d-483b-95c1-32961dad6226-nmstate-lock\") pod \"nmstate-handler-mdh82\" (UID: \"ef7f3b63-3a7d-483b-95c1-32961dad6226\") " pod="openshift-nmstate/nmstate-handler-mdh82" Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.763730 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/db5d41a4-2808-4189-8c3e-e0730cdf1a4f-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-9cfpv\" (UID: \"db5d41a4-2808-4189-8c3e-e0730cdf1a4f\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-9cfpv" Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.763816 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhgpp\" (UniqueName: \"kubernetes.io/projected/ef7f3b63-3a7d-483b-95c1-32961dad6226-kube-api-access-xhgpp\") pod \"nmstate-handler-mdh82\" (UID: \"ef7f3b63-3a7d-483b-95c1-32961dad6226\") " pod="openshift-nmstate/nmstate-handler-mdh82" Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.763940 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2qvg\" (UniqueName: \"kubernetes.io/projected/6f56c0b5-3d27-49e6-af5b-6ad929d9e857-kube-api-access-k2qvg\") pod 
\"nmstate-metrics-9b8c8685d-wr9vc\" (UID: \"6f56c0b5-3d27-49e6-af5b-6ad929d9e857\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-wr9vc" Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.764040 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/ef7f3b63-3a7d-483b-95c1-32961dad6226-dbus-socket\") pod \"nmstate-handler-mdh82\" (UID: \"ef7f3b63-3a7d-483b-95c1-32961dad6226\") " pod="openshift-nmstate/nmstate-handler-mdh82" Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.764140 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2wd6\" (UniqueName: \"kubernetes.io/projected/db5d41a4-2808-4189-8c3e-e0730cdf1a4f-kube-api-access-s2wd6\") pod \"nmstate-webhook-5f558f5558-9cfpv\" (UID: \"db5d41a4-2808-4189-8c3e-e0730cdf1a4f\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-9cfpv" Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.792493 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2qvg\" (UniqueName: \"kubernetes.io/projected/6f56c0b5-3d27-49e6-af5b-6ad929d9e857-kube-api-access-k2qvg\") pod \"nmstate-metrics-9b8c8685d-wr9vc\" (UID: \"6f56c0b5-3d27-49e6-af5b-6ad929d9e857\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-wr9vc" Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.813382 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-l98lx"] Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.814243 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-l98lx" Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.816437 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-8qkr7" Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.816941 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.823648 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.828783 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-l98lx"] Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.866912 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/ef7f3b63-3a7d-483b-95c1-32961dad6226-nmstate-lock\") pod \"nmstate-handler-mdh82\" (UID: \"ef7f3b63-3a7d-483b-95c1-32961dad6226\") " pod="openshift-nmstate/nmstate-handler-mdh82" Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.867026 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/db5d41a4-2808-4189-8c3e-e0730cdf1a4f-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-9cfpv\" (UID: \"db5d41a4-2808-4189-8c3e-e0730cdf1a4f\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-9cfpv" Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.867069 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhgpp\" (UniqueName: \"kubernetes.io/projected/ef7f3b63-3a7d-483b-95c1-32961dad6226-kube-api-access-xhgpp\") pod \"nmstate-handler-mdh82\" (UID: \"ef7f3b63-3a7d-483b-95c1-32961dad6226\") " pod="openshift-nmstate/nmstate-handler-mdh82" Mar 20 11:10:40 crc kubenswrapper[4860]: 
I0320 11:10:40.867142 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2k6r2\" (UniqueName: \"kubernetes.io/projected/a91c6f2b-7646-4f4d-bdc2-47304e36da4e-kube-api-access-2k6r2\") pod \"nmstate-console-plugin-86f58fcf4-l98lx\" (UID: \"a91c6f2b-7646-4f4d-bdc2-47304e36da4e\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-l98lx" Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.867174 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/ef7f3b63-3a7d-483b-95c1-32961dad6226-dbus-socket\") pod \"nmstate-handler-mdh82\" (UID: \"ef7f3b63-3a7d-483b-95c1-32961dad6226\") " pod="openshift-nmstate/nmstate-handler-mdh82" Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.867239 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2wd6\" (UniqueName: \"kubernetes.io/projected/db5d41a4-2808-4189-8c3e-e0730cdf1a4f-kube-api-access-s2wd6\") pod \"nmstate-webhook-5f558f5558-9cfpv\" (UID: \"db5d41a4-2808-4189-8c3e-e0730cdf1a4f\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-9cfpv" Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.867733 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/ef7f3b63-3a7d-483b-95c1-32961dad6226-nmstate-lock\") pod \"nmstate-handler-mdh82\" (UID: \"ef7f3b63-3a7d-483b-95c1-32961dad6226\") " pod="openshift-nmstate/nmstate-handler-mdh82" Mar 20 11:10:40 crc kubenswrapper[4860]: E0320 11:10:40.867789 4860 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.867857 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/a91c6f2b-7646-4f4d-bdc2-47304e36da4e-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-l98lx\" (UID: \"a91c6f2b-7646-4f4d-bdc2-47304e36da4e\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-l98lx" Mar 20 11:10:40 crc kubenswrapper[4860]: E0320 11:10:40.867892 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db5d41a4-2808-4189-8c3e-e0730cdf1a4f-tls-key-pair podName:db5d41a4-2808-4189-8c3e-e0730cdf1a4f nodeName:}" failed. No retries permitted until 2026-03-20 11:10:41.367870218 +0000 UTC m=+965.589231116 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/db5d41a4-2808-4189-8c3e-e0730cdf1a4f-tls-key-pair") pod "nmstate-webhook-5f558f5558-9cfpv" (UID: "db5d41a4-2808-4189-8c3e-e0730cdf1a4f") : secret "openshift-nmstate-webhook" not found Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.867915 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a91c6f2b-7646-4f4d-bdc2-47304e36da4e-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-l98lx\" (UID: \"a91c6f2b-7646-4f4d-bdc2-47304e36da4e\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-l98lx" Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.867960 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/ef7f3b63-3a7d-483b-95c1-32961dad6226-ovs-socket\") pod \"nmstate-handler-mdh82\" (UID: \"ef7f3b63-3a7d-483b-95c1-32961dad6226\") " pod="openshift-nmstate/nmstate-handler-mdh82" Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.868139 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/ef7f3b63-3a7d-483b-95c1-32961dad6226-ovs-socket\") pod \"nmstate-handler-mdh82\" (UID: 
\"ef7f3b63-3a7d-483b-95c1-32961dad6226\") " pod="openshift-nmstate/nmstate-handler-mdh82" Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.868540 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/ef7f3b63-3a7d-483b-95c1-32961dad6226-dbus-socket\") pod \"nmstate-handler-mdh82\" (UID: \"ef7f3b63-3a7d-483b-95c1-32961dad6226\") " pod="openshift-nmstate/nmstate-handler-mdh82" Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.887024 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhgpp\" (UniqueName: \"kubernetes.io/projected/ef7f3b63-3a7d-483b-95c1-32961dad6226-kube-api-access-xhgpp\") pod \"nmstate-handler-mdh82\" (UID: \"ef7f3b63-3a7d-483b-95c1-32961dad6226\") " pod="openshift-nmstate/nmstate-handler-mdh82" Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.893683 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2wd6\" (UniqueName: \"kubernetes.io/projected/db5d41a4-2808-4189-8c3e-e0730cdf1a4f-kube-api-access-s2wd6\") pod \"nmstate-webhook-5f558f5558-9cfpv\" (UID: \"db5d41a4-2808-4189-8c3e-e0730cdf1a4f\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-9cfpv" Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.947848 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-wr9vc" Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.968685 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2k6r2\" (UniqueName: \"kubernetes.io/projected/a91c6f2b-7646-4f4d-bdc2-47304e36da4e-kube-api-access-2k6r2\") pod \"nmstate-console-plugin-86f58fcf4-l98lx\" (UID: \"a91c6f2b-7646-4f4d-bdc2-47304e36da4e\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-l98lx" Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.968749 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a91c6f2b-7646-4f4d-bdc2-47304e36da4e-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-l98lx\" (UID: \"a91c6f2b-7646-4f4d-bdc2-47304e36da4e\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-l98lx" Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.968772 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a91c6f2b-7646-4f4d-bdc2-47304e36da4e-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-l98lx\" (UID: \"a91c6f2b-7646-4f4d-bdc2-47304e36da4e\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-l98lx" Mar 20 11:10:40 crc kubenswrapper[4860]: E0320 11:10:40.968968 4860 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Mar 20 11:10:40 crc kubenswrapper[4860]: E0320 11:10:40.969034 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a91c6f2b-7646-4f4d-bdc2-47304e36da4e-plugin-serving-cert podName:a91c6f2b-7646-4f4d-bdc2-47304e36da4e nodeName:}" failed. No retries permitted until 2026-03-20 11:10:41.469015153 +0000 UTC m=+965.690376051 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/a91c6f2b-7646-4f4d-bdc2-47304e36da4e-plugin-serving-cert") pod "nmstate-console-plugin-86f58fcf4-l98lx" (UID: "a91c6f2b-7646-4f4d-bdc2-47304e36da4e") : secret "plugin-serving-cert" not found Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.970482 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a91c6f2b-7646-4f4d-bdc2-47304e36da4e-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-l98lx\" (UID: \"a91c6f2b-7646-4f4d-bdc2-47304e36da4e\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-l98lx" Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:40.999927 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2k6r2\" (UniqueName: \"kubernetes.io/projected/a91c6f2b-7646-4f4d-bdc2-47304e36da4e-kube-api-access-2k6r2\") pod \"nmstate-console-plugin-86f58fcf4-l98lx\" (UID: \"a91c6f2b-7646-4f4d-bdc2-47304e36da4e\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-l98lx" Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.001513 4860 scope.go:117] "RemoveContainer" containerID="d842a5cd77f0d9f0965cbf10a0f92313f544e8649c8e3427de05d3a92939e32e" Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.023964 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-mdh82" Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.032128 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-cc764bbd9-xmdhb"] Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.033129 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-cc764bbd9-xmdhb" Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.053352 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-cc764bbd9-xmdhb"] Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.065511 4860 scope.go:117] "RemoveContainer" containerID="e1d897530152cf8d1ddca69100f0ae29a4da57552de29a47aaed46aa70fa805e" Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.070802 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42d7da76-e33e-46c9-a2ef-1715fb8e8500-trusted-ca-bundle\") pod \"console-cc764bbd9-xmdhb\" (UID: \"42d7da76-e33e-46c9-a2ef-1715fb8e8500\") " pod="openshift-console/console-cc764bbd9-xmdhb" Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.071402 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/42d7da76-e33e-46c9-a2ef-1715fb8e8500-console-config\") pod \"console-cc764bbd9-xmdhb\" (UID: \"42d7da76-e33e-46c9-a2ef-1715fb8e8500\") " pod="openshift-console/console-cc764bbd9-xmdhb" Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.071431 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/42d7da76-e33e-46c9-a2ef-1715fb8e8500-console-oauth-config\") pod \"console-cc764bbd9-xmdhb\" (UID: \"42d7da76-e33e-46c9-a2ef-1715fb8e8500\") " pod="openshift-console/console-cc764bbd9-xmdhb" Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.071467 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/42d7da76-e33e-46c9-a2ef-1715fb8e8500-service-ca\") pod \"console-cc764bbd9-xmdhb\" (UID: \"42d7da76-e33e-46c9-a2ef-1715fb8e8500\") 
" pod="openshift-console/console-cc764bbd9-xmdhb" Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.071510 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pz8zt\" (UniqueName: \"kubernetes.io/projected/42d7da76-e33e-46c9-a2ef-1715fb8e8500-kube-api-access-pz8zt\") pod \"console-cc764bbd9-xmdhb\" (UID: \"42d7da76-e33e-46c9-a2ef-1715fb8e8500\") " pod="openshift-console/console-cc764bbd9-xmdhb" Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.071560 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/42d7da76-e33e-46c9-a2ef-1715fb8e8500-oauth-serving-cert\") pod \"console-cc764bbd9-xmdhb\" (UID: \"42d7da76-e33e-46c9-a2ef-1715fb8e8500\") " pod="openshift-console/console-cc764bbd9-xmdhb" Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.071596 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/42d7da76-e33e-46c9-a2ef-1715fb8e8500-console-serving-cert\") pod \"console-cc764bbd9-xmdhb\" (UID: \"42d7da76-e33e-46c9-a2ef-1715fb8e8500\") " pod="openshift-console/console-cc764bbd9-xmdhb" Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.173338 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/42d7da76-e33e-46c9-a2ef-1715fb8e8500-oauth-serving-cert\") pod \"console-cc764bbd9-xmdhb\" (UID: \"42d7da76-e33e-46c9-a2ef-1715fb8e8500\") " pod="openshift-console/console-cc764bbd9-xmdhb" Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.173748 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/42d7da76-e33e-46c9-a2ef-1715fb8e8500-console-serving-cert\") pod \"console-cc764bbd9-xmdhb\" (UID: 
\"42d7da76-e33e-46c9-a2ef-1715fb8e8500\") " pod="openshift-console/console-cc764bbd9-xmdhb" Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.173780 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42d7da76-e33e-46c9-a2ef-1715fb8e8500-trusted-ca-bundle\") pod \"console-cc764bbd9-xmdhb\" (UID: \"42d7da76-e33e-46c9-a2ef-1715fb8e8500\") " pod="openshift-console/console-cc764bbd9-xmdhb" Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.173809 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/42d7da76-e33e-46c9-a2ef-1715fb8e8500-console-config\") pod \"console-cc764bbd9-xmdhb\" (UID: \"42d7da76-e33e-46c9-a2ef-1715fb8e8500\") " pod="openshift-console/console-cc764bbd9-xmdhb" Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.173839 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/42d7da76-e33e-46c9-a2ef-1715fb8e8500-console-oauth-config\") pod \"console-cc764bbd9-xmdhb\" (UID: \"42d7da76-e33e-46c9-a2ef-1715fb8e8500\") " pod="openshift-console/console-cc764bbd9-xmdhb" Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.173893 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/42d7da76-e33e-46c9-a2ef-1715fb8e8500-service-ca\") pod \"console-cc764bbd9-xmdhb\" (UID: \"42d7da76-e33e-46c9-a2ef-1715fb8e8500\") " pod="openshift-console/console-cc764bbd9-xmdhb" Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.173937 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pz8zt\" (UniqueName: \"kubernetes.io/projected/42d7da76-e33e-46c9-a2ef-1715fb8e8500-kube-api-access-pz8zt\") pod \"console-cc764bbd9-xmdhb\" (UID: \"42d7da76-e33e-46c9-a2ef-1715fb8e8500\") " 
pod="openshift-console/console-cc764bbd9-xmdhb" Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.175849 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/42d7da76-e33e-46c9-a2ef-1715fb8e8500-oauth-serving-cert\") pod \"console-cc764bbd9-xmdhb\" (UID: \"42d7da76-e33e-46c9-a2ef-1715fb8e8500\") " pod="openshift-console/console-cc764bbd9-xmdhb" Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.176407 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/42d7da76-e33e-46c9-a2ef-1715fb8e8500-console-config\") pod \"console-cc764bbd9-xmdhb\" (UID: \"42d7da76-e33e-46c9-a2ef-1715fb8e8500\") " pod="openshift-console/console-cc764bbd9-xmdhb" Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.177117 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/42d7da76-e33e-46c9-a2ef-1715fb8e8500-service-ca\") pod \"console-cc764bbd9-xmdhb\" (UID: \"42d7da76-e33e-46c9-a2ef-1715fb8e8500\") " pod="openshift-console/console-cc764bbd9-xmdhb" Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.177479 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42d7da76-e33e-46c9-a2ef-1715fb8e8500-trusted-ca-bundle\") pod \"console-cc764bbd9-xmdhb\" (UID: \"42d7da76-e33e-46c9-a2ef-1715fb8e8500\") " pod="openshift-console/console-cc764bbd9-xmdhb" Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.189092 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/42d7da76-e33e-46c9-a2ef-1715fb8e8500-console-serving-cert\") pod \"console-cc764bbd9-xmdhb\" (UID: \"42d7da76-e33e-46c9-a2ef-1715fb8e8500\") " pod="openshift-console/console-cc764bbd9-xmdhb" Mar 20 11:10:41 crc kubenswrapper[4860]: 
I0320 11:10:41.192004 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/42d7da76-e33e-46c9-a2ef-1715fb8e8500-console-oauth-config\") pod \"console-cc764bbd9-xmdhb\" (UID: \"42d7da76-e33e-46c9-a2ef-1715fb8e8500\") " pod="openshift-console/console-cc764bbd9-xmdhb" Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.201364 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pz8zt\" (UniqueName: \"kubernetes.io/projected/42d7da76-e33e-46c9-a2ef-1715fb8e8500-kube-api-access-pz8zt\") pod \"console-cc764bbd9-xmdhb\" (UID: \"42d7da76-e33e-46c9-a2ef-1715fb8e8500\") " pod="openshift-console/console-cc764bbd9-xmdhb" Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.293711 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-wr9vc"] Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.375731 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-cc764bbd9-xmdhb" Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.376242 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/db5d41a4-2808-4189-8c3e-e0730cdf1a4f-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-9cfpv\" (UID: \"db5d41a4-2808-4189-8c3e-e0730cdf1a4f\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-9cfpv" Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.384177 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/db5d41a4-2808-4189-8c3e-e0730cdf1a4f-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-9cfpv\" (UID: \"db5d41a4-2808-4189-8c3e-e0730cdf1a4f\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-9cfpv" Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.477656 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a91c6f2b-7646-4f4d-bdc2-47304e36da4e-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-l98lx\" (UID: \"a91c6f2b-7646-4f4d-bdc2-47304e36da4e\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-l98lx" Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.483790 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a91c6f2b-7646-4f4d-bdc2-47304e36da4e-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-l98lx\" (UID: \"a91c6f2b-7646-4f4d-bdc2-47304e36da4e\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-l98lx" Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.575972 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-cc764bbd9-xmdhb"] Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.576415 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-9cfpv" Mar 20 11:10:41 crc kubenswrapper[4860]: W0320 11:10:41.584246 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42d7da76_e33e_46c9_a2ef_1715fb8e8500.slice/crio-5884b3549c8041244f1ecb77a9d8f4d57eefb8cb54e37132c324e9d17047044f WatchSource:0}: Error finding container 5884b3549c8041244f1ecb77a9d8f4d57eefb8cb54e37132c324e9d17047044f: Status 404 returned error can't find the container with id 5884b3549c8041244f1ecb77a9d8f4d57eefb8cb54e37132c324e9d17047044f Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.734843 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cmc44_a89c8af2-338f-401f-aad5-c6d7763a3b3a/kube-multus/2.log" Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.736524 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-mdh82" event={"ID":"ef7f3b63-3a7d-483b-95c1-32961dad6226","Type":"ContainerStarted","Data":"a27495590bd1154ea3d0454792b6a5b090f14e5d389499e7f55b1cf8c57d0a90"} Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.738794 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-cc764bbd9-xmdhb" event={"ID":"42d7da76-e33e-46c9-a2ef-1715fb8e8500","Type":"ContainerStarted","Data":"5884b3549c8041244f1ecb77a9d8f4d57eefb8cb54e37132c324e9d17047044f"} Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.739846 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-wr9vc" event={"ID":"6f56c0b5-3d27-49e6-af5b-6ad929d9e857","Type":"ContainerStarted","Data":"f74e0b7a674065bb9df6ebca3548534d5143eca7602fb403d78610a6263a7e79"} Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.757413 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-l98lx" Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.784567 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-9cfpv"] Mar 20 11:10:41 crc kubenswrapper[4860]: W0320 11:10:41.837264 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb5d41a4_2808_4189_8c3e_e0730cdf1a4f.slice/crio-563dfec71651638ecb2dc9559abba0fdf6cc5772fd1139322b8938ff2790e36d WatchSource:0}: Error finding container 563dfec71651638ecb2dc9559abba0fdf6cc5772fd1139322b8938ff2790e36d: Status 404 returned error can't find the container with id 563dfec71651638ecb2dc9559abba0fdf6cc5772fd1139322b8938ff2790e36d Mar 20 11:10:42 crc kubenswrapper[4860]: I0320 11:10:42.014134 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-l98lx"] Mar 20 11:10:42 crc kubenswrapper[4860]: I0320 11:10:42.749962 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-9cfpv" event={"ID":"db5d41a4-2808-4189-8c3e-e0730cdf1a4f","Type":"ContainerStarted","Data":"563dfec71651638ecb2dc9559abba0fdf6cc5772fd1139322b8938ff2790e36d"} Mar 20 11:10:42 crc kubenswrapper[4860]: I0320 11:10:42.754607 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-cc764bbd9-xmdhb" event={"ID":"42d7da76-e33e-46c9-a2ef-1715fb8e8500","Type":"ContainerStarted","Data":"adf76be9e6a417f21ea9010902bc5927911fceabcd94bbab61923ca33fb1801d"} Mar 20 11:10:42 crc kubenswrapper[4860]: I0320 11:10:42.756079 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-l98lx" event={"ID":"a91c6f2b-7646-4f4d-bdc2-47304e36da4e","Type":"ContainerStarted","Data":"d30eceaa7d60c69131cb3e850cb2ea6ba9d5379a363fc20c5a6c5894895c4402"} Mar 20 11:10:42 crc 
kubenswrapper[4860]: I0320 11:10:42.779816 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-cc764bbd9-xmdhb" podStartSLOduration=1.779786933 podStartE2EDuration="1.779786933s" podCreationTimestamp="2026-03-20 11:10:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 11:10:42.775369272 +0000 UTC m=+966.996730190" watchObservedRunningTime="2026-03-20 11:10:42.779786933 +0000 UTC m=+967.001147831" Mar 20 11:10:43 crc kubenswrapper[4860]: I0320 11:10:43.144050 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-b9j9j" Mar 20 11:10:43 crc kubenswrapper[4860]: I0320 11:10:43.222466 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-b9j9j" Mar 20 11:10:43 crc kubenswrapper[4860]: I0320 11:10:43.383085 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-b9j9j"] Mar 20 11:10:44 crc kubenswrapper[4860]: I0320 11:10:44.791011 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-wr9vc" event={"ID":"6f56c0b5-3d27-49e6-af5b-6ad929d9e857","Type":"ContainerStarted","Data":"bebfa7562c61cae81e8ea8e2b71c6a147d1fcfc6bb35c09497bd07de9d2ae333"} Mar 20 11:10:44 crc kubenswrapper[4860]: I0320 11:10:44.795218 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-b9j9j" podUID="be396660-5e1e-4dfe-9a08-26b2fcc69a0a" containerName="registry-server" containerID="cri-o://cd3a05044bfe778d42dcd060ef5d691bbbe313f7cf0b5e7dfb5bc41edbe2c760" gracePeriod=2 Mar 20 11:10:44 crc kubenswrapper[4860]: I0320 11:10:44.796206 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-9cfpv" 
event={"ID":"db5d41a4-2808-4189-8c3e-e0730cdf1a4f","Type":"ContainerStarted","Data":"22a8f73c83319d6ecc9355fa4c7d585428493a897367a07354f1f2674beb3e72"} Mar 20 11:10:44 crc kubenswrapper[4860]: I0320 11:10:44.796247 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-9cfpv" Mar 20 11:10:44 crc kubenswrapper[4860]: I0320 11:10:44.826675 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-9cfpv" podStartSLOduration=2.115690119 podStartE2EDuration="4.826653826s" podCreationTimestamp="2026-03-20 11:10:40 +0000 UTC" firstStartedPulling="2026-03-20 11:10:41.840334602 +0000 UTC m=+966.061695500" lastFinishedPulling="2026-03-20 11:10:44.551298299 +0000 UTC m=+968.772659207" observedRunningTime="2026-03-20 11:10:44.824396794 +0000 UTC m=+969.045757692" watchObservedRunningTime="2026-03-20 11:10:44.826653826 +0000 UTC m=+969.048014724" Mar 20 11:10:45 crc kubenswrapper[4860]: I0320 11:10:45.153281 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-b9j9j" Mar 20 11:10:45 crc kubenswrapper[4860]: I0320 11:10:45.346512 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7tgs\" (UniqueName: \"kubernetes.io/projected/be396660-5e1e-4dfe-9a08-26b2fcc69a0a-kube-api-access-q7tgs\") pod \"be396660-5e1e-4dfe-9a08-26b2fcc69a0a\" (UID: \"be396660-5e1e-4dfe-9a08-26b2fcc69a0a\") " Mar 20 11:10:45 crc kubenswrapper[4860]: I0320 11:10:45.346640 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be396660-5e1e-4dfe-9a08-26b2fcc69a0a-catalog-content\") pod \"be396660-5e1e-4dfe-9a08-26b2fcc69a0a\" (UID: \"be396660-5e1e-4dfe-9a08-26b2fcc69a0a\") " Mar 20 11:10:45 crc kubenswrapper[4860]: I0320 11:10:45.346751 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be396660-5e1e-4dfe-9a08-26b2fcc69a0a-utilities\") pod \"be396660-5e1e-4dfe-9a08-26b2fcc69a0a\" (UID: \"be396660-5e1e-4dfe-9a08-26b2fcc69a0a\") " Mar 20 11:10:45 crc kubenswrapper[4860]: I0320 11:10:45.348163 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be396660-5e1e-4dfe-9a08-26b2fcc69a0a-utilities" (OuterVolumeSpecName: "utilities") pod "be396660-5e1e-4dfe-9a08-26b2fcc69a0a" (UID: "be396660-5e1e-4dfe-9a08-26b2fcc69a0a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:10:45 crc kubenswrapper[4860]: I0320 11:10:45.354169 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be396660-5e1e-4dfe-9a08-26b2fcc69a0a-kube-api-access-q7tgs" (OuterVolumeSpecName: "kube-api-access-q7tgs") pod "be396660-5e1e-4dfe-9a08-26b2fcc69a0a" (UID: "be396660-5e1e-4dfe-9a08-26b2fcc69a0a"). InnerVolumeSpecName "kube-api-access-q7tgs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:10:45 crc kubenswrapper[4860]: I0320 11:10:45.450744 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7tgs\" (UniqueName: \"kubernetes.io/projected/be396660-5e1e-4dfe-9a08-26b2fcc69a0a-kube-api-access-q7tgs\") on node \"crc\" DevicePath \"\"" Mar 20 11:10:45 crc kubenswrapper[4860]: I0320 11:10:45.450789 4860 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be396660-5e1e-4dfe-9a08-26b2fcc69a0a-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:10:45 crc kubenswrapper[4860]: I0320 11:10:45.484675 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be396660-5e1e-4dfe-9a08-26b2fcc69a0a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "be396660-5e1e-4dfe-9a08-26b2fcc69a0a" (UID: "be396660-5e1e-4dfe-9a08-26b2fcc69a0a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:10:45 crc kubenswrapper[4860]: I0320 11:10:45.552190 4860 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be396660-5e1e-4dfe-9a08-26b2fcc69a0a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:10:46 crc kubenswrapper[4860]: I0320 11:10:46.468896 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-mdh82" event={"ID":"ef7f3b63-3a7d-483b-95c1-32961dad6226","Type":"ContainerStarted","Data":"8344bd2879de4da2b9ad33341da7217357530e39b5cb4b8649c24afd700d6f1a"} Mar 20 11:10:46 crc kubenswrapper[4860]: I0320 11:10:46.468987 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-mdh82" Mar 20 11:10:46 crc kubenswrapper[4860]: I0320 11:10:46.477922 4860 generic.go:334] "Generic (PLEG): container finished" podID="be396660-5e1e-4dfe-9a08-26b2fcc69a0a" 
containerID="cd3a05044bfe778d42dcd060ef5d691bbbe313f7cf0b5e7dfb5bc41edbe2c760" exitCode=0 Mar 20 11:10:46 crc kubenswrapper[4860]: I0320 11:10:46.478010 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b9j9j" event={"ID":"be396660-5e1e-4dfe-9a08-26b2fcc69a0a","Type":"ContainerDied","Data":"cd3a05044bfe778d42dcd060ef5d691bbbe313f7cf0b5e7dfb5bc41edbe2c760"} Mar 20 11:10:46 crc kubenswrapper[4860]: I0320 11:10:46.480878 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b9j9j" event={"ID":"be396660-5e1e-4dfe-9a08-26b2fcc69a0a","Type":"ContainerDied","Data":"7ba86a3ba39138c4bcf143a2055d72aa4da66a6fe5dfcdc6f44ec0ae82cefec5"} Mar 20 11:10:46 crc kubenswrapper[4860]: I0320 11:10:46.478060 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b9j9j" Mar 20 11:10:46 crc kubenswrapper[4860]: I0320 11:10:46.480915 4860 scope.go:117] "RemoveContainer" containerID="cd3a05044bfe778d42dcd060ef5d691bbbe313f7cf0b5e7dfb5bc41edbe2c760" Mar 20 11:10:46 crc kubenswrapper[4860]: I0320 11:10:46.489958 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-mdh82" podStartSLOduration=3.026331702 podStartE2EDuration="6.489882222s" podCreationTimestamp="2026-03-20 11:10:40 +0000 UTC" firstStartedPulling="2026-03-20 11:10:41.096346814 +0000 UTC m=+965.317707712" lastFinishedPulling="2026-03-20 11:10:44.559897334 +0000 UTC m=+968.781258232" observedRunningTime="2026-03-20 11:10:46.48686421 +0000 UTC m=+970.708225108" watchObservedRunningTime="2026-03-20 11:10:46.489882222 +0000 UTC m=+970.711243120" Mar 20 11:10:46 crc kubenswrapper[4860]: I0320 11:10:46.526973 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-b9j9j"] Mar 20 11:10:46 crc kubenswrapper[4860]: I0320 11:10:46.532042 4860 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openshift-marketplace/redhat-operators-b9j9j"] Mar 20 11:10:46 crc kubenswrapper[4860]: I0320 11:10:46.559214 4860 scope.go:117] "RemoveContainer" containerID="e64b2de3852cba02ad07be7827131798bd5071b683d1f700c0238cb432da1cfd" Mar 20 11:10:46 crc kubenswrapper[4860]: I0320 11:10:46.589046 4860 scope.go:117] "RemoveContainer" containerID="6175a4aeb09e5e4ddb845b3a23c481aab432f44fb75f04e279bd3991e69ab364" Mar 20 11:10:46 crc kubenswrapper[4860]: I0320 11:10:46.730342 4860 scope.go:117] "RemoveContainer" containerID="cd3a05044bfe778d42dcd060ef5d691bbbe313f7cf0b5e7dfb5bc41edbe2c760" Mar 20 11:10:46 crc kubenswrapper[4860]: E0320 11:10:46.731004 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd3a05044bfe778d42dcd060ef5d691bbbe313f7cf0b5e7dfb5bc41edbe2c760\": container with ID starting with cd3a05044bfe778d42dcd060ef5d691bbbe313f7cf0b5e7dfb5bc41edbe2c760 not found: ID does not exist" containerID="cd3a05044bfe778d42dcd060ef5d691bbbe313f7cf0b5e7dfb5bc41edbe2c760" Mar 20 11:10:46 crc kubenswrapper[4860]: I0320 11:10:46.731074 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd3a05044bfe778d42dcd060ef5d691bbbe313f7cf0b5e7dfb5bc41edbe2c760"} err="failed to get container status \"cd3a05044bfe778d42dcd060ef5d691bbbe313f7cf0b5e7dfb5bc41edbe2c760\": rpc error: code = NotFound desc = could not find container \"cd3a05044bfe778d42dcd060ef5d691bbbe313f7cf0b5e7dfb5bc41edbe2c760\": container with ID starting with cd3a05044bfe778d42dcd060ef5d691bbbe313f7cf0b5e7dfb5bc41edbe2c760 not found: ID does not exist" Mar 20 11:10:46 crc kubenswrapper[4860]: I0320 11:10:46.731117 4860 scope.go:117] "RemoveContainer" containerID="e64b2de3852cba02ad07be7827131798bd5071b683d1f700c0238cb432da1cfd" Mar 20 11:10:46 crc kubenswrapper[4860]: E0320 11:10:46.731632 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound 
desc = could not find container \"e64b2de3852cba02ad07be7827131798bd5071b683d1f700c0238cb432da1cfd\": container with ID starting with e64b2de3852cba02ad07be7827131798bd5071b683d1f700c0238cb432da1cfd not found: ID does not exist" containerID="e64b2de3852cba02ad07be7827131798bd5071b683d1f700c0238cb432da1cfd" Mar 20 11:10:46 crc kubenswrapper[4860]: I0320 11:10:46.731694 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e64b2de3852cba02ad07be7827131798bd5071b683d1f700c0238cb432da1cfd"} err="failed to get container status \"e64b2de3852cba02ad07be7827131798bd5071b683d1f700c0238cb432da1cfd\": rpc error: code = NotFound desc = could not find container \"e64b2de3852cba02ad07be7827131798bd5071b683d1f700c0238cb432da1cfd\": container with ID starting with e64b2de3852cba02ad07be7827131798bd5071b683d1f700c0238cb432da1cfd not found: ID does not exist" Mar 20 11:10:46 crc kubenswrapper[4860]: I0320 11:10:46.731741 4860 scope.go:117] "RemoveContainer" containerID="6175a4aeb09e5e4ddb845b3a23c481aab432f44fb75f04e279bd3991e69ab364" Mar 20 11:10:46 crc kubenswrapper[4860]: E0320 11:10:46.732165 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6175a4aeb09e5e4ddb845b3a23c481aab432f44fb75f04e279bd3991e69ab364\": container with ID starting with 6175a4aeb09e5e4ddb845b3a23c481aab432f44fb75f04e279bd3991e69ab364 not found: ID does not exist" containerID="6175a4aeb09e5e4ddb845b3a23c481aab432f44fb75f04e279bd3991e69ab364" Mar 20 11:10:46 crc kubenswrapper[4860]: I0320 11:10:46.732189 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6175a4aeb09e5e4ddb845b3a23c481aab432f44fb75f04e279bd3991e69ab364"} err="failed to get container status \"6175a4aeb09e5e4ddb845b3a23c481aab432f44fb75f04e279bd3991e69ab364\": rpc error: code = NotFound desc = could not find container 
\"6175a4aeb09e5e4ddb845b3a23c481aab432f44fb75f04e279bd3991e69ab364\": container with ID starting with 6175a4aeb09e5e4ddb845b3a23c481aab432f44fb75f04e279bd3991e69ab364 not found: ID does not exist" Mar 20 11:10:47 crc kubenswrapper[4860]: I0320 11:10:47.427737 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be396660-5e1e-4dfe-9a08-26b2fcc69a0a" path="/var/lib/kubelet/pods/be396660-5e1e-4dfe-9a08-26b2fcc69a0a/volumes" Mar 20 11:10:47 crc kubenswrapper[4860]: I0320 11:10:47.493788 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-l98lx" event={"ID":"a91c6f2b-7646-4f4d-bdc2-47304e36da4e","Type":"ContainerStarted","Data":"e0069371eb676eef23776e18e272defe4de88da747fe9fd9340cf99f976033ac"} Mar 20 11:10:47 crc kubenswrapper[4860]: I0320 11:10:47.515625 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-l98lx" podStartSLOduration=2.788907831 podStartE2EDuration="7.51558994s" podCreationTimestamp="2026-03-20 11:10:40 +0000 UTC" firstStartedPulling="2026-03-20 11:10:42.02646922 +0000 UTC m=+966.247830118" lastFinishedPulling="2026-03-20 11:10:46.753151329 +0000 UTC m=+970.974512227" observedRunningTime="2026-03-20 11:10:47.510925813 +0000 UTC m=+971.732286731" watchObservedRunningTime="2026-03-20 11:10:47.51558994 +0000 UTC m=+971.736950838" Mar 20 11:10:48 crc kubenswrapper[4860]: I0320 11:10:48.503747 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-wr9vc" event={"ID":"6f56c0b5-3d27-49e6-af5b-6ad929d9e857","Type":"ContainerStarted","Data":"54ba53c2a41cd712f4f8515e69ca2d5dd8e9b98b94af121599a9b32c107d5efb"} Mar 20 11:10:48 crc kubenswrapper[4860]: I0320 11:10:48.530505 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-wr9vc" podStartSLOduration=1.9662549139999999 
podStartE2EDuration="8.530479038s" podCreationTimestamp="2026-03-20 11:10:40 +0000 UTC" firstStartedPulling="2026-03-20 11:10:41.31091673 +0000 UTC m=+965.532277628" lastFinishedPulling="2026-03-20 11:10:47.875140854 +0000 UTC m=+972.096501752" observedRunningTime="2026-03-20 11:10:48.524815335 +0000 UTC m=+972.746176233" watchObservedRunningTime="2026-03-20 11:10:48.530479038 +0000 UTC m=+972.751839936"
Mar 20 11:10:51 crc kubenswrapper[4860]: I0320 11:10:51.053620 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-mdh82"
Mar 20 11:10:51 crc kubenswrapper[4860]: I0320 11:10:51.376763 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-cc764bbd9-xmdhb"
Mar 20 11:10:51 crc kubenswrapper[4860]: I0320 11:10:51.376842 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-cc764bbd9-xmdhb"
Mar 20 11:10:51 crc kubenswrapper[4860]: I0320 11:10:51.383637 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-cc764bbd9-xmdhb"
Mar 20 11:10:51 crc kubenswrapper[4860]: I0320 11:10:51.527356 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-cc764bbd9-xmdhb"
Mar 20 11:10:51 crc kubenswrapper[4860]: I0320 11:10:51.585078 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-sqrz5"]
Mar 20 11:10:52 crc kubenswrapper[4860]: I0320 11:10:52.344535 4860 patch_prober.go:28] interesting pod/machine-config-daemon-kvdqp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 11:10:52 crc kubenswrapper[4860]: I0320 11:10:52.345080 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 11:11:01 crc kubenswrapper[4860]: I0320 11:11:01.582356 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-9cfpv"
Mar 20 11:11:14 crc kubenswrapper[4860]: I0320 11:11:14.864918 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sgngn"]
Mar 20 11:11:14 crc kubenswrapper[4860]: E0320 11:11:14.866099 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be396660-5e1e-4dfe-9a08-26b2fcc69a0a" containerName="registry-server"
Mar 20 11:11:14 crc kubenswrapper[4860]: I0320 11:11:14.866116 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="be396660-5e1e-4dfe-9a08-26b2fcc69a0a" containerName="registry-server"
Mar 20 11:11:14 crc kubenswrapper[4860]: E0320 11:11:14.866139 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be396660-5e1e-4dfe-9a08-26b2fcc69a0a" containerName="extract-utilities"
Mar 20 11:11:14 crc kubenswrapper[4860]: I0320 11:11:14.866146 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="be396660-5e1e-4dfe-9a08-26b2fcc69a0a" containerName="extract-utilities"
Mar 20 11:11:14 crc kubenswrapper[4860]: E0320 11:11:14.866155 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be396660-5e1e-4dfe-9a08-26b2fcc69a0a" containerName="extract-content"
Mar 20 11:11:14 crc kubenswrapper[4860]: I0320 11:11:14.866161 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="be396660-5e1e-4dfe-9a08-26b2fcc69a0a" containerName="extract-content"
Mar 20 11:11:14 crc kubenswrapper[4860]: I0320 11:11:14.866316 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="be396660-5e1e-4dfe-9a08-26b2fcc69a0a" containerName="registry-server"
Mar 20 11:11:14 crc kubenswrapper[4860]: I0320 11:11:14.867309 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sgngn"
Mar 20 11:11:14 crc kubenswrapper[4860]: I0320 11:11:14.872150 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Mar 20 11:11:14 crc kubenswrapper[4860]: I0320 11:11:14.877759 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sgngn"]
Mar 20 11:11:15 crc kubenswrapper[4860]: I0320 11:11:15.002750 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7da0294e-5ac5-4655-b882-cfd1f36ce791-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sgngn\" (UID: \"7da0294e-5ac5-4655-b882-cfd1f36ce791\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sgngn"
Mar 20 11:11:15 crc kubenswrapper[4860]: I0320 11:11:15.002807 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7da0294e-5ac5-4655-b882-cfd1f36ce791-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sgngn\" (UID: \"7da0294e-5ac5-4655-b882-cfd1f36ce791\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sgngn"
Mar 20 11:11:15 crc kubenswrapper[4860]: I0320 11:11:15.002955 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6frwf\" (UniqueName: \"kubernetes.io/projected/7da0294e-5ac5-4655-b882-cfd1f36ce791-kube-api-access-6frwf\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sgngn\" (UID: \"7da0294e-5ac5-4655-b882-cfd1f36ce791\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sgngn"
Mar 20 11:11:15 crc kubenswrapper[4860]: I0320 11:11:15.104577 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6frwf\" (UniqueName: \"kubernetes.io/projected/7da0294e-5ac5-4655-b882-cfd1f36ce791-kube-api-access-6frwf\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sgngn\" (UID: \"7da0294e-5ac5-4655-b882-cfd1f36ce791\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sgngn"
Mar 20 11:11:15 crc kubenswrapper[4860]: I0320 11:11:15.104672 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7da0294e-5ac5-4655-b882-cfd1f36ce791-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sgngn\" (UID: \"7da0294e-5ac5-4655-b882-cfd1f36ce791\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sgngn"
Mar 20 11:11:15 crc kubenswrapper[4860]: I0320 11:11:15.104699 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7da0294e-5ac5-4655-b882-cfd1f36ce791-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sgngn\" (UID: \"7da0294e-5ac5-4655-b882-cfd1f36ce791\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sgngn"
Mar 20 11:11:15 crc kubenswrapper[4860]: I0320 11:11:15.105365 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7da0294e-5ac5-4655-b882-cfd1f36ce791-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sgngn\" (UID: \"7da0294e-5ac5-4655-b882-cfd1f36ce791\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sgngn"
Mar 20 11:11:15 crc kubenswrapper[4860]: I0320 11:11:15.105420 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7da0294e-5ac5-4655-b882-cfd1f36ce791-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sgngn\" (UID: \"7da0294e-5ac5-4655-b882-cfd1f36ce791\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sgngn"
Mar 20 11:11:15 crc kubenswrapper[4860]: I0320 11:11:15.126218 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6frwf\" (UniqueName: \"kubernetes.io/projected/7da0294e-5ac5-4655-b882-cfd1f36ce791-kube-api-access-6frwf\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sgngn\" (UID: \"7da0294e-5ac5-4655-b882-cfd1f36ce791\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sgngn"
Mar 20 11:11:15 crc kubenswrapper[4860]: I0320 11:11:15.194216 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sgngn"
Mar 20 11:11:15 crc kubenswrapper[4860]: I0320 11:11:15.608855 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sgngn"]
Mar 20 11:11:15 crc kubenswrapper[4860]: I0320 11:11:15.701435 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sgngn" event={"ID":"7da0294e-5ac5-4655-b882-cfd1f36ce791","Type":"ContainerStarted","Data":"6605f719a5d599cbc168598e36bac59f42a20935c2dd222c9a8f61b2a7ff744b"}
Mar 20 11:11:16 crc kubenswrapper[4860]: I0320 11:11:16.631269 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-sqrz5" podUID="e8ca532e-b0d7-494c-886f-bff0c8009707" containerName="console" containerID="cri-o://589d18b62b1f0e4971b0fb3e2fc61341927ed65c1c47ad52b5aa882f952f550d" gracePeriod=15
Mar 20 11:11:16 crc kubenswrapper[4860]: I0320 11:11:16.710627 4860 generic.go:334] "Generic (PLEG): container finished" podID="7da0294e-5ac5-4655-b882-cfd1f36ce791" containerID="22d182dcb02a0ca2445f79c27a70decf1dd99ec7371a15142ed192e16a2e8fa2" exitCode=0
Mar 20 11:11:16 crc kubenswrapper[4860]: I0320 11:11:16.710684 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sgngn" event={"ID":"7da0294e-5ac5-4655-b882-cfd1f36ce791","Type":"ContainerDied","Data":"22d182dcb02a0ca2445f79c27a70decf1dd99ec7371a15142ed192e16a2e8fa2"}
Mar 20 11:11:17 crc kubenswrapper[4860]: I0320 11:11:17.016398 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-sqrz5_e8ca532e-b0d7-494c-886f-bff0c8009707/console/0.log"
Mar 20 11:11:17 crc kubenswrapper[4860]: I0320 11:11:17.016503 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-sqrz5"
Mar 20 11:11:17 crc kubenswrapper[4860]: I0320 11:11:17.137428 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e8ca532e-b0d7-494c-886f-bff0c8009707-console-config\") pod \"e8ca532e-b0d7-494c-886f-bff0c8009707\" (UID: \"e8ca532e-b0d7-494c-886f-bff0c8009707\") "
Mar 20 11:11:17 crc kubenswrapper[4860]: I0320 11:11:17.137571 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8ca532e-b0d7-494c-886f-bff0c8009707-trusted-ca-bundle\") pod \"e8ca532e-b0d7-494c-886f-bff0c8009707\" (UID: \"e8ca532e-b0d7-494c-886f-bff0c8009707\") "
Mar 20 11:11:17 crc kubenswrapper[4860]: I0320 11:11:17.137610 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e8ca532e-b0d7-494c-886f-bff0c8009707-service-ca\") pod \"e8ca532e-b0d7-494c-886f-bff0c8009707\" (UID: \"e8ca532e-b0d7-494c-886f-bff0c8009707\") "
Mar 20 11:11:17 crc kubenswrapper[4860]: I0320 11:11:17.137662 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lk8bz\" (UniqueName: \"kubernetes.io/projected/e8ca532e-b0d7-494c-886f-bff0c8009707-kube-api-access-lk8bz\") pod \"e8ca532e-b0d7-494c-886f-bff0c8009707\" (UID: \"e8ca532e-b0d7-494c-886f-bff0c8009707\") "
Mar 20 11:11:17 crc kubenswrapper[4860]: I0320 11:11:17.137796 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e8ca532e-b0d7-494c-886f-bff0c8009707-console-serving-cert\") pod \"e8ca532e-b0d7-494c-886f-bff0c8009707\" (UID: \"e8ca532e-b0d7-494c-886f-bff0c8009707\") "
Mar 20 11:11:17 crc kubenswrapper[4860]: I0320 11:11:17.137846 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e8ca532e-b0d7-494c-886f-bff0c8009707-oauth-serving-cert\") pod \"e8ca532e-b0d7-494c-886f-bff0c8009707\" (UID: \"e8ca532e-b0d7-494c-886f-bff0c8009707\") "
Mar 20 11:11:17 crc kubenswrapper[4860]: I0320 11:11:17.137872 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e8ca532e-b0d7-494c-886f-bff0c8009707-console-oauth-config\") pod \"e8ca532e-b0d7-494c-886f-bff0c8009707\" (UID: \"e8ca532e-b0d7-494c-886f-bff0c8009707\") "
Mar 20 11:11:17 crc kubenswrapper[4860]: I0320 11:11:17.138256 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8ca532e-b0d7-494c-886f-bff0c8009707-console-config" (OuterVolumeSpecName: "console-config") pod "e8ca532e-b0d7-494c-886f-bff0c8009707" (UID: "e8ca532e-b0d7-494c-886f-bff0c8009707"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 11:11:17 crc kubenswrapper[4860]: I0320 11:11:17.138315 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8ca532e-b0d7-494c-886f-bff0c8009707-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "e8ca532e-b0d7-494c-886f-bff0c8009707" (UID: "e8ca532e-b0d7-494c-886f-bff0c8009707"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 11:11:17 crc kubenswrapper[4860]: I0320 11:11:17.138727 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8ca532e-b0d7-494c-886f-bff0c8009707-service-ca" (OuterVolumeSpecName: "service-ca") pod "e8ca532e-b0d7-494c-886f-bff0c8009707" (UID: "e8ca532e-b0d7-494c-886f-bff0c8009707"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 11:11:17 crc kubenswrapper[4860]: I0320 11:11:17.139154 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8ca532e-b0d7-494c-886f-bff0c8009707-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "e8ca532e-b0d7-494c-886f-bff0c8009707" (UID: "e8ca532e-b0d7-494c-886f-bff0c8009707"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 11:11:17 crc kubenswrapper[4860]: I0320 11:11:17.144742 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8ca532e-b0d7-494c-886f-bff0c8009707-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "e8ca532e-b0d7-494c-886f-bff0c8009707" (UID: "e8ca532e-b0d7-494c-886f-bff0c8009707"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 11:11:17 crc kubenswrapper[4860]: I0320 11:11:17.145803 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8ca532e-b0d7-494c-886f-bff0c8009707-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "e8ca532e-b0d7-494c-886f-bff0c8009707" (UID: "e8ca532e-b0d7-494c-886f-bff0c8009707"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 11:11:17 crc kubenswrapper[4860]: I0320 11:11:17.147079 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8ca532e-b0d7-494c-886f-bff0c8009707-kube-api-access-lk8bz" (OuterVolumeSpecName: "kube-api-access-lk8bz") pod "e8ca532e-b0d7-494c-886f-bff0c8009707" (UID: "e8ca532e-b0d7-494c-886f-bff0c8009707"). InnerVolumeSpecName "kube-api-access-lk8bz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 11:11:17 crc kubenswrapper[4860]: I0320 11:11:17.239355 4860 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e8ca532e-b0d7-494c-886f-bff0c8009707-console-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 11:11:17 crc kubenswrapper[4860]: I0320 11:11:17.239402 4860 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e8ca532e-b0d7-494c-886f-bff0c8009707-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 11:11:17 crc kubenswrapper[4860]: I0320 11:11:17.239411 4860 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e8ca532e-b0d7-494c-886f-bff0c8009707-console-oauth-config\") on node \"crc\" DevicePath \"\""
Mar 20 11:11:17 crc kubenswrapper[4860]: I0320 11:11:17.239424 4860 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e8ca532e-b0d7-494c-886f-bff0c8009707-console-config\") on node \"crc\" DevicePath \"\""
Mar 20 11:11:17 crc kubenswrapper[4860]: I0320 11:11:17.239443 4860 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8ca532e-b0d7-494c-886f-bff0c8009707-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 11:11:17 crc kubenswrapper[4860]: I0320 11:11:17.239476 4860 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e8ca532e-b0d7-494c-886f-bff0c8009707-service-ca\") on node \"crc\" DevicePath \"\""
Mar 20 11:11:17 crc kubenswrapper[4860]: I0320 11:11:17.239488 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lk8bz\" (UniqueName: \"kubernetes.io/projected/e8ca532e-b0d7-494c-886f-bff0c8009707-kube-api-access-lk8bz\") on node \"crc\" DevicePath \"\""
Mar 20 11:11:17 crc kubenswrapper[4860]: I0320 11:11:17.718672 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-sqrz5_e8ca532e-b0d7-494c-886f-bff0c8009707/console/0.log"
Mar 20 11:11:17 crc kubenswrapper[4860]: I0320 11:11:17.718740 4860 generic.go:334] "Generic (PLEG): container finished" podID="e8ca532e-b0d7-494c-886f-bff0c8009707" containerID="589d18b62b1f0e4971b0fb3e2fc61341927ed65c1c47ad52b5aa882f952f550d" exitCode=2
Mar 20 11:11:17 crc kubenswrapper[4860]: I0320 11:11:17.718779 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-sqrz5" event={"ID":"e8ca532e-b0d7-494c-886f-bff0c8009707","Type":"ContainerDied","Data":"589d18b62b1f0e4971b0fb3e2fc61341927ed65c1c47ad52b5aa882f952f550d"}
Mar 20 11:11:17 crc kubenswrapper[4860]: I0320 11:11:17.718809 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-sqrz5" event={"ID":"e8ca532e-b0d7-494c-886f-bff0c8009707","Type":"ContainerDied","Data":"72e1e1c0612e639b5d9b1dd93371fee28768245c503b21f6343128336d8f4145"}
Mar 20 11:11:17 crc kubenswrapper[4860]: I0320 11:11:17.718831 4860 scope.go:117] "RemoveContainer" containerID="589d18b62b1f0e4971b0fb3e2fc61341927ed65c1c47ad52b5aa882f952f550d"
Mar 20 11:11:17 crc kubenswrapper[4860]: I0320 11:11:17.719038 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-sqrz5"
Mar 20 11:11:17 crc kubenswrapper[4860]: I0320 11:11:17.742944 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-sqrz5"]
Mar 20 11:11:17 crc kubenswrapper[4860]: I0320 11:11:17.748039 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-sqrz5"]
Mar 20 11:11:17 crc kubenswrapper[4860]: I0320 11:11:17.748295 4860 scope.go:117] "RemoveContainer" containerID="589d18b62b1f0e4971b0fb3e2fc61341927ed65c1c47ad52b5aa882f952f550d"
Mar 20 11:11:17 crc kubenswrapper[4860]: E0320 11:11:17.749186 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"589d18b62b1f0e4971b0fb3e2fc61341927ed65c1c47ad52b5aa882f952f550d\": container with ID starting with 589d18b62b1f0e4971b0fb3e2fc61341927ed65c1c47ad52b5aa882f952f550d not found: ID does not exist" containerID="589d18b62b1f0e4971b0fb3e2fc61341927ed65c1c47ad52b5aa882f952f550d"
Mar 20 11:11:17 crc kubenswrapper[4860]: I0320 11:11:17.749216 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"589d18b62b1f0e4971b0fb3e2fc61341927ed65c1c47ad52b5aa882f952f550d"} err="failed to get container status \"589d18b62b1f0e4971b0fb3e2fc61341927ed65c1c47ad52b5aa882f952f550d\": rpc error: code = NotFound desc = could not find container \"589d18b62b1f0e4971b0fb3e2fc61341927ed65c1c47ad52b5aa882f952f550d\": container with ID starting with 589d18b62b1f0e4971b0fb3e2fc61341927ed65c1c47ad52b5aa882f952f550d not found: ID does not exist"
Mar 20 11:11:18 crc kubenswrapper[4860]: I0320 11:11:18.730251 4860 generic.go:334] "Generic (PLEG): container finished" podID="7da0294e-5ac5-4655-b882-cfd1f36ce791" containerID="5c8185a2c1b9ec4fbf156815fc31a2380dc1abffd3d79fed1741875444c47732" exitCode=0
Mar 20 11:11:18 crc kubenswrapper[4860]: I0320 11:11:18.730298 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sgngn" event={"ID":"7da0294e-5ac5-4655-b882-cfd1f36ce791","Type":"ContainerDied","Data":"5c8185a2c1b9ec4fbf156815fc31a2380dc1abffd3d79fed1741875444c47732"}
Mar 20 11:11:19 crc kubenswrapper[4860]: I0320 11:11:19.421852 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8ca532e-b0d7-494c-886f-bff0c8009707" path="/var/lib/kubelet/pods/e8ca532e-b0d7-494c-886f-bff0c8009707/volumes"
Mar 20 11:11:19 crc kubenswrapper[4860]: I0320 11:11:19.622806 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nskpj"]
Mar 20 11:11:19 crc kubenswrapper[4860]: E0320 11:11:19.623129 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8ca532e-b0d7-494c-886f-bff0c8009707" containerName="console"
Mar 20 11:11:19 crc kubenswrapper[4860]: I0320 11:11:19.623145 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8ca532e-b0d7-494c-886f-bff0c8009707" containerName="console"
Mar 20 11:11:19 crc kubenswrapper[4860]: I0320 11:11:19.623324 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8ca532e-b0d7-494c-886f-bff0c8009707" containerName="console"
Mar 20 11:11:19 crc kubenswrapper[4860]: I0320 11:11:19.624715 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nskpj"
Mar 20 11:11:19 crc kubenswrapper[4860]: I0320 11:11:19.634540 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nskpj"]
Mar 20 11:11:19 crc kubenswrapper[4860]: I0320 11:11:19.741029 4860 generic.go:334] "Generic (PLEG): container finished" podID="7da0294e-5ac5-4655-b882-cfd1f36ce791" containerID="0a9e0ba5e1736f05b3ed3f26a2231469f8608a5402919e94471951eed9b9dfc6" exitCode=0
Mar 20 11:11:19 crc kubenswrapper[4860]: I0320 11:11:19.741087 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sgngn" event={"ID":"7da0294e-5ac5-4655-b882-cfd1f36ce791","Type":"ContainerDied","Data":"0a9e0ba5e1736f05b3ed3f26a2231469f8608a5402919e94471951eed9b9dfc6"}
Mar 20 11:11:19 crc kubenswrapper[4860]: I0320 11:11:19.780717 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/659f81bd-8f32-4dd2-9a63-6b2665fa6647-utilities\") pod \"redhat-marketplace-nskpj\" (UID: \"659f81bd-8f32-4dd2-9a63-6b2665fa6647\") " pod="openshift-marketplace/redhat-marketplace-nskpj"
Mar 20 11:11:19 crc kubenswrapper[4860]: I0320 11:11:19.780805 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngdgx\" (UniqueName: \"kubernetes.io/projected/659f81bd-8f32-4dd2-9a63-6b2665fa6647-kube-api-access-ngdgx\") pod \"redhat-marketplace-nskpj\" (UID: \"659f81bd-8f32-4dd2-9a63-6b2665fa6647\") " pod="openshift-marketplace/redhat-marketplace-nskpj"
Mar 20 11:11:19 crc kubenswrapper[4860]: I0320 11:11:19.780855 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/659f81bd-8f32-4dd2-9a63-6b2665fa6647-catalog-content\") pod \"redhat-marketplace-nskpj\" (UID: \"659f81bd-8f32-4dd2-9a63-6b2665fa6647\") " pod="openshift-marketplace/redhat-marketplace-nskpj"
Mar 20 11:11:19 crc kubenswrapper[4860]: I0320 11:11:19.881817 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/659f81bd-8f32-4dd2-9a63-6b2665fa6647-catalog-content\") pod \"redhat-marketplace-nskpj\" (UID: \"659f81bd-8f32-4dd2-9a63-6b2665fa6647\") " pod="openshift-marketplace/redhat-marketplace-nskpj"
Mar 20 11:11:19 crc kubenswrapper[4860]: I0320 11:11:19.881906 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/659f81bd-8f32-4dd2-9a63-6b2665fa6647-utilities\") pod \"redhat-marketplace-nskpj\" (UID: \"659f81bd-8f32-4dd2-9a63-6b2665fa6647\") " pod="openshift-marketplace/redhat-marketplace-nskpj"
Mar 20 11:11:19 crc kubenswrapper[4860]: I0320 11:11:19.881950 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngdgx\" (UniqueName: \"kubernetes.io/projected/659f81bd-8f32-4dd2-9a63-6b2665fa6647-kube-api-access-ngdgx\") pod \"redhat-marketplace-nskpj\" (UID: \"659f81bd-8f32-4dd2-9a63-6b2665fa6647\") " pod="openshift-marketplace/redhat-marketplace-nskpj"
Mar 20 11:11:19 crc kubenswrapper[4860]: I0320 11:11:19.882509 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/659f81bd-8f32-4dd2-9a63-6b2665fa6647-catalog-content\") pod \"redhat-marketplace-nskpj\" (UID: \"659f81bd-8f32-4dd2-9a63-6b2665fa6647\") " pod="openshift-marketplace/redhat-marketplace-nskpj"
Mar 20 11:11:19 crc kubenswrapper[4860]: I0320 11:11:19.883214 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/659f81bd-8f32-4dd2-9a63-6b2665fa6647-utilities\") pod \"redhat-marketplace-nskpj\" (UID: \"659f81bd-8f32-4dd2-9a63-6b2665fa6647\") " pod="openshift-marketplace/redhat-marketplace-nskpj"
Mar 20 11:11:19 crc kubenswrapper[4860]: I0320 11:11:19.905129 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngdgx\" (UniqueName: \"kubernetes.io/projected/659f81bd-8f32-4dd2-9a63-6b2665fa6647-kube-api-access-ngdgx\") pod \"redhat-marketplace-nskpj\" (UID: \"659f81bd-8f32-4dd2-9a63-6b2665fa6647\") " pod="openshift-marketplace/redhat-marketplace-nskpj"
Mar 20 11:11:19 crc kubenswrapper[4860]: I0320 11:11:19.950678 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nskpj"
Mar 20 11:11:20 crc kubenswrapper[4860]: I0320 11:11:20.189260 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nskpj"]
Mar 20 11:11:20 crc kubenswrapper[4860]: I0320 11:11:20.747959 4860 generic.go:334] "Generic (PLEG): container finished" podID="659f81bd-8f32-4dd2-9a63-6b2665fa6647" containerID="a3af29e4cd182fe4ad8f9d7ac048c0de3c19c12bb591945ad5bc082f0dae2c7c" exitCode=0
Mar 20 11:11:20 crc kubenswrapper[4860]: I0320 11:11:20.748053 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nskpj" event={"ID":"659f81bd-8f32-4dd2-9a63-6b2665fa6647","Type":"ContainerDied","Data":"a3af29e4cd182fe4ad8f9d7ac048c0de3c19c12bb591945ad5bc082f0dae2c7c"}
Mar 20 11:11:20 crc kubenswrapper[4860]: I0320 11:11:20.748436 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nskpj" event={"ID":"659f81bd-8f32-4dd2-9a63-6b2665fa6647","Type":"ContainerStarted","Data":"41dfd9902a89b0dfeabbbcc8c55d12c6c1350b3b1f92b4a588be3bf82690038f"}
Mar 20 11:11:21 crc kubenswrapper[4860]: I0320 11:11:21.017849 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jwpk7"]
Mar 20 11:11:21 crc kubenswrapper[4860]: I0320 11:11:21.019286 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jwpk7"
Mar 20 11:11:21 crc kubenswrapper[4860]: I0320 11:11:21.025883 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sgngn"
Mar 20 11:11:21 crc kubenswrapper[4860]: I0320 11:11:21.052979 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jwpk7"]
Mar 20 11:11:21 crc kubenswrapper[4860]: I0320 11:11:21.100062 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c65a3188-9f0a-4525-b876-635822d8dda4-catalog-content\") pod \"certified-operators-jwpk7\" (UID: \"c65a3188-9f0a-4525-b876-635822d8dda4\") " pod="openshift-marketplace/certified-operators-jwpk7"
Mar 20 11:11:21 crc kubenswrapper[4860]: I0320 11:11:21.100132 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c65a3188-9f0a-4525-b876-635822d8dda4-utilities\") pod \"certified-operators-jwpk7\" (UID: \"c65a3188-9f0a-4525-b876-635822d8dda4\") " pod="openshift-marketplace/certified-operators-jwpk7"
Mar 20 11:11:21 crc kubenswrapper[4860]: I0320 11:11:21.100173 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6jkn\" (UniqueName: \"kubernetes.io/projected/c65a3188-9f0a-4525-b876-635822d8dda4-kube-api-access-d6jkn\") pod \"certified-operators-jwpk7\" (UID: \"c65a3188-9f0a-4525-b876-635822d8dda4\") " pod="openshift-marketplace/certified-operators-jwpk7"
Mar 20 11:11:21 crc kubenswrapper[4860]: I0320 11:11:21.201451 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7da0294e-5ac5-4655-b882-cfd1f36ce791-util\") pod \"7da0294e-5ac5-4655-b882-cfd1f36ce791\" (UID: \"7da0294e-5ac5-4655-b882-cfd1f36ce791\") "
Mar 20 11:11:21 crc kubenswrapper[4860]: I0320 11:11:21.201556 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7da0294e-5ac5-4655-b882-cfd1f36ce791-bundle\") pod \"7da0294e-5ac5-4655-b882-cfd1f36ce791\" (UID: \"7da0294e-5ac5-4655-b882-cfd1f36ce791\") "
Mar 20 11:11:21 crc kubenswrapper[4860]: I0320 11:11:21.201600 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6frwf\" (UniqueName: \"kubernetes.io/projected/7da0294e-5ac5-4655-b882-cfd1f36ce791-kube-api-access-6frwf\") pod \"7da0294e-5ac5-4655-b882-cfd1f36ce791\" (UID: \"7da0294e-5ac5-4655-b882-cfd1f36ce791\") "
Mar 20 11:11:21 crc kubenswrapper[4860]: I0320 11:11:21.201873 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c65a3188-9f0a-4525-b876-635822d8dda4-catalog-content\") pod \"certified-operators-jwpk7\" (UID: \"c65a3188-9f0a-4525-b876-635822d8dda4\") " pod="openshift-marketplace/certified-operators-jwpk7"
Mar 20 11:11:21 crc kubenswrapper[4860]: I0320 11:11:21.201911 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c65a3188-9f0a-4525-b876-635822d8dda4-utilities\") pod \"certified-operators-jwpk7\" (UID: \"c65a3188-9f0a-4525-b876-635822d8dda4\") " pod="openshift-marketplace/certified-operators-jwpk7"
Mar 20 11:11:21 crc kubenswrapper[4860]: I0320 11:11:21.201946 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6jkn\" (UniqueName: \"kubernetes.io/projected/c65a3188-9f0a-4525-b876-635822d8dda4-kube-api-access-d6jkn\") pod \"certified-operators-jwpk7\" (UID: \"c65a3188-9f0a-4525-b876-635822d8dda4\") " pod="openshift-marketplace/certified-operators-jwpk7"
Mar 20 11:11:21 crc kubenswrapper[4860]: I0320 11:11:21.202593 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c65a3188-9f0a-4525-b876-635822d8dda4-utilities\") pod \"certified-operators-jwpk7\" (UID: \"c65a3188-9f0a-4525-b876-635822d8dda4\") " pod="openshift-marketplace/certified-operators-jwpk7"
Mar 20 11:11:21 crc kubenswrapper[4860]: I0320 11:11:21.202777 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c65a3188-9f0a-4525-b876-635822d8dda4-catalog-content\") pod \"certified-operators-jwpk7\" (UID: \"c65a3188-9f0a-4525-b876-635822d8dda4\") " pod="openshift-marketplace/certified-operators-jwpk7"
Mar 20 11:11:21 crc kubenswrapper[4860]: I0320 11:11:21.203199 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7da0294e-5ac5-4655-b882-cfd1f36ce791-bundle" (OuterVolumeSpecName: "bundle") pod "7da0294e-5ac5-4655-b882-cfd1f36ce791" (UID: "7da0294e-5ac5-4655-b882-cfd1f36ce791"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 11:11:21 crc kubenswrapper[4860]: I0320 11:11:21.217353 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7da0294e-5ac5-4655-b882-cfd1f36ce791-kube-api-access-6frwf" (OuterVolumeSpecName: "kube-api-access-6frwf") pod "7da0294e-5ac5-4655-b882-cfd1f36ce791" (UID: "7da0294e-5ac5-4655-b882-cfd1f36ce791"). InnerVolumeSpecName "kube-api-access-6frwf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 11:11:21 crc kubenswrapper[4860]: I0320 11:11:21.229142 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6jkn\" (UniqueName: \"kubernetes.io/projected/c65a3188-9f0a-4525-b876-635822d8dda4-kube-api-access-d6jkn\") pod \"certified-operators-jwpk7\" (UID: \"c65a3188-9f0a-4525-b876-635822d8dda4\") " pod="openshift-marketplace/certified-operators-jwpk7"
Mar 20 11:11:21 crc kubenswrapper[4860]: I0320 11:11:21.304286 4860 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7da0294e-5ac5-4655-b882-cfd1f36ce791-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 11:11:21 crc kubenswrapper[4860]: I0320 11:11:21.304349 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6frwf\" (UniqueName: \"kubernetes.io/projected/7da0294e-5ac5-4655-b882-cfd1f36ce791-kube-api-access-6frwf\") on node \"crc\" DevicePath \"\""
Mar 20 11:11:21 crc kubenswrapper[4860]: I0320 11:11:21.348195 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7da0294e-5ac5-4655-b882-cfd1f36ce791-util" (OuterVolumeSpecName: "util") pod "7da0294e-5ac5-4655-b882-cfd1f36ce791" (UID: "7da0294e-5ac5-4655-b882-cfd1f36ce791"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 11:11:21 crc kubenswrapper[4860]: I0320 11:11:21.361096 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jwpk7"
Mar 20 11:11:21 crc kubenswrapper[4860]: I0320 11:11:21.406066 4860 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7da0294e-5ac5-4655-b882-cfd1f36ce791-util\") on node \"crc\" DevicePath \"\""
Mar 20 11:11:21 crc kubenswrapper[4860]: I0320 11:11:21.626478 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jwpk7"]
Mar 20 11:11:21 crc kubenswrapper[4860]: I0320 11:11:21.761336 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nskpj" event={"ID":"659f81bd-8f32-4dd2-9a63-6b2665fa6647","Type":"ContainerStarted","Data":"c6814f68bcca89f3e902966194b072801e04bd2ac54dbd226c91e59deb878610"}
Mar 20 11:11:21 crc kubenswrapper[4860]: I0320 11:11:21.763810 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jwpk7" event={"ID":"c65a3188-9f0a-4525-b876-635822d8dda4","Type":"ContainerStarted","Data":"e6307789218fa3c6a03a6f158c301aa7e53133c1765dba4abc69e40da0ce5696"}
Mar 20 11:11:21 crc kubenswrapper[4860]: I0320 11:11:21.766190 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sgngn" event={"ID":"7da0294e-5ac5-4655-b882-cfd1f36ce791","Type":"ContainerDied","Data":"6605f719a5d599cbc168598e36bac59f42a20935c2dd222c9a8f61b2a7ff744b"}
Mar 20 11:11:21 crc kubenswrapper[4860]: I0320 11:11:21.766288 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6605f719a5d599cbc168598e36bac59f42a20935c2dd222c9a8f61b2a7ff744b"
Mar 20 11:11:21 crc kubenswrapper[4860]: I0320 11:11:21.766294 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sgngn"
Mar 20 11:11:22 crc kubenswrapper[4860]: I0320 11:11:22.344633 4860 patch_prober.go:28] interesting pod/machine-config-daemon-kvdqp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 11:11:22 crc kubenswrapper[4860]: I0320 11:11:22.344726 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 11:11:22 crc kubenswrapper[4860]: I0320 11:11:22.775946 4860 generic.go:334] "Generic (PLEG): container finished" podID="c65a3188-9f0a-4525-b876-635822d8dda4" containerID="e14537304077b317ad670f8e2115905292466a260253edc43ac45ce35e816ef6" exitCode=0
Mar 20 11:11:22 crc kubenswrapper[4860]: I0320 11:11:22.776045 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jwpk7" event={"ID":"c65a3188-9f0a-4525-b876-635822d8dda4","Type":"ContainerDied","Data":"e14537304077b317ad670f8e2115905292466a260253edc43ac45ce35e816ef6"}
Mar 20 11:11:22 crc kubenswrapper[4860]: I0320 11:11:22.779141 4860 generic.go:334] "Generic (PLEG): container finished" podID="659f81bd-8f32-4dd2-9a63-6b2665fa6647" containerID="c6814f68bcca89f3e902966194b072801e04bd2ac54dbd226c91e59deb878610" exitCode=0
Mar 20 11:11:22 crc kubenswrapper[4860]: I0320 11:11:22.779200 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nskpj"
event={"ID":"659f81bd-8f32-4dd2-9a63-6b2665fa6647","Type":"ContainerDied","Data":"c6814f68bcca89f3e902966194b072801e04bd2ac54dbd226c91e59deb878610"} Mar 20 11:11:23 crc kubenswrapper[4860]: I0320 11:11:23.787900 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jwpk7" event={"ID":"c65a3188-9f0a-4525-b876-635822d8dda4","Type":"ContainerStarted","Data":"e68be3bab36d85e5b37688fc943817973a421b95bb420a49b7aa9a1cf465b12b"} Mar 20 11:11:23 crc kubenswrapper[4860]: I0320 11:11:23.791024 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nskpj" event={"ID":"659f81bd-8f32-4dd2-9a63-6b2665fa6647","Type":"ContainerStarted","Data":"42bad4c11bf889acf02f0318595f240e854802661f3002d50e2d7621ab40266d"} Mar 20 11:11:24 crc kubenswrapper[4860]: I0320 11:11:24.800726 4860 generic.go:334] "Generic (PLEG): container finished" podID="c65a3188-9f0a-4525-b876-635822d8dda4" containerID="e68be3bab36d85e5b37688fc943817973a421b95bb420a49b7aa9a1cf465b12b" exitCode=0 Mar 20 11:11:24 crc kubenswrapper[4860]: I0320 11:11:24.800813 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jwpk7" event={"ID":"c65a3188-9f0a-4525-b876-635822d8dda4","Type":"ContainerDied","Data":"e68be3bab36d85e5b37688fc943817973a421b95bb420a49b7aa9a1cf465b12b"} Mar 20 11:11:24 crc kubenswrapper[4860]: I0320 11:11:24.825350 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-nskpj" podStartSLOduration=3.088717664 podStartE2EDuration="5.825324235s" podCreationTimestamp="2026-03-20 11:11:19 +0000 UTC" firstStartedPulling="2026-03-20 11:11:20.750537964 +0000 UTC m=+1004.971898862" lastFinishedPulling="2026-03-20 11:11:23.487144535 +0000 UTC m=+1007.708505433" observedRunningTime="2026-03-20 11:11:23.833484581 +0000 UTC m=+1008.054845479" watchObservedRunningTime="2026-03-20 11:11:24.825324235 +0000 UTC 
m=+1009.046685133" Mar 20 11:11:25 crc kubenswrapper[4860]: I0320 11:11:25.812409 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jwpk7" event={"ID":"c65a3188-9f0a-4525-b876-635822d8dda4","Type":"ContainerStarted","Data":"fcf22871cd5e74e6fe79c7e6a0618d042337435567daae70e294425c235cfefb"} Mar 20 11:11:25 crc kubenswrapper[4860]: I0320 11:11:25.848318 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jwpk7" podStartSLOduration=3.37822803 podStartE2EDuration="5.848290928s" podCreationTimestamp="2026-03-20 11:11:20 +0000 UTC" firstStartedPulling="2026-03-20 11:11:22.778154065 +0000 UTC m=+1006.999514963" lastFinishedPulling="2026-03-20 11:11:25.248216963 +0000 UTC m=+1009.469577861" observedRunningTime="2026-03-20 11:11:25.843705944 +0000 UTC m=+1010.065066852" watchObservedRunningTime="2026-03-20 11:11:25.848290928 +0000 UTC m=+1010.069651826" Mar 20 11:11:29 crc kubenswrapper[4860]: I0320 11:11:29.951893 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-nskpj" Mar 20 11:11:29 crc kubenswrapper[4860]: I0320 11:11:29.952862 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-nskpj" Mar 20 11:11:30 crc kubenswrapper[4860]: I0320 11:11:30.007984 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-nskpj" Mar 20 11:11:30 crc kubenswrapper[4860]: I0320 11:11:30.495759 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-5c589f6ccd-bcpmf"] Mar 20 11:11:30 crc kubenswrapper[4860]: E0320 11:11:30.496635 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7da0294e-5ac5-4655-b882-cfd1f36ce791" containerName="pull" Mar 20 11:11:30 crc kubenswrapper[4860]: I0320 11:11:30.496733 4860 
state_mem.go:107] "Deleted CPUSet assignment" podUID="7da0294e-5ac5-4655-b882-cfd1f36ce791" containerName="pull" Mar 20 11:11:30 crc kubenswrapper[4860]: E0320 11:11:30.496814 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7da0294e-5ac5-4655-b882-cfd1f36ce791" containerName="extract" Mar 20 11:11:30 crc kubenswrapper[4860]: I0320 11:11:30.496881 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="7da0294e-5ac5-4655-b882-cfd1f36ce791" containerName="extract" Mar 20 11:11:30 crc kubenswrapper[4860]: E0320 11:11:30.496955 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7da0294e-5ac5-4655-b882-cfd1f36ce791" containerName="util" Mar 20 11:11:30 crc kubenswrapper[4860]: I0320 11:11:30.497016 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="7da0294e-5ac5-4655-b882-cfd1f36ce791" containerName="util" Mar 20 11:11:30 crc kubenswrapper[4860]: I0320 11:11:30.497263 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="7da0294e-5ac5-4655-b882-cfd1f36ce791" containerName="extract" Mar 20 11:11:30 crc kubenswrapper[4860]: I0320 11:11:30.497927 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5c589f6ccd-bcpmf" Mar 20 11:11:30 crc kubenswrapper[4860]: I0320 11:11:30.501554 4860 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 20 11:11:30 crc kubenswrapper[4860]: I0320 11:11:30.502017 4860 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-lgrkw" Mar 20 11:11:30 crc kubenswrapper[4860]: I0320 11:11:30.502082 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 20 11:11:30 crc kubenswrapper[4860]: I0320 11:11:30.502082 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 20 11:11:30 crc kubenswrapper[4860]: I0320 11:11:30.503378 4860 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 20 11:11:30 crc kubenswrapper[4860]: I0320 11:11:30.531750 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5c589f6ccd-bcpmf"] Mar 20 11:11:30 crc kubenswrapper[4860]: I0320 11:11:30.560108 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bb8f951b-6aa9-420c-9bad-dfa857482d4c-apiservice-cert\") pod \"metallb-operator-controller-manager-5c589f6ccd-bcpmf\" (UID: \"bb8f951b-6aa9-420c-9bad-dfa857482d4c\") " pod="metallb-system/metallb-operator-controller-manager-5c589f6ccd-bcpmf" Mar 20 11:11:30 crc kubenswrapper[4860]: I0320 11:11:30.560268 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bb8f951b-6aa9-420c-9bad-dfa857482d4c-webhook-cert\") pod \"metallb-operator-controller-manager-5c589f6ccd-bcpmf\" (UID: 
\"bb8f951b-6aa9-420c-9bad-dfa857482d4c\") " pod="metallb-system/metallb-operator-controller-manager-5c589f6ccd-bcpmf" Mar 20 11:11:30 crc kubenswrapper[4860]: I0320 11:11:30.560302 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27srv\" (UniqueName: \"kubernetes.io/projected/bb8f951b-6aa9-420c-9bad-dfa857482d4c-kube-api-access-27srv\") pod \"metallb-operator-controller-manager-5c589f6ccd-bcpmf\" (UID: \"bb8f951b-6aa9-420c-9bad-dfa857482d4c\") " pod="metallb-system/metallb-operator-controller-manager-5c589f6ccd-bcpmf" Mar 20 11:11:30 crc kubenswrapper[4860]: I0320 11:11:30.661558 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bb8f951b-6aa9-420c-9bad-dfa857482d4c-apiservice-cert\") pod \"metallb-operator-controller-manager-5c589f6ccd-bcpmf\" (UID: \"bb8f951b-6aa9-420c-9bad-dfa857482d4c\") " pod="metallb-system/metallb-operator-controller-manager-5c589f6ccd-bcpmf" Mar 20 11:11:30 crc kubenswrapper[4860]: I0320 11:11:30.661695 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bb8f951b-6aa9-420c-9bad-dfa857482d4c-webhook-cert\") pod \"metallb-operator-controller-manager-5c589f6ccd-bcpmf\" (UID: \"bb8f951b-6aa9-420c-9bad-dfa857482d4c\") " pod="metallb-system/metallb-operator-controller-manager-5c589f6ccd-bcpmf" Mar 20 11:11:30 crc kubenswrapper[4860]: I0320 11:11:30.661723 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27srv\" (UniqueName: \"kubernetes.io/projected/bb8f951b-6aa9-420c-9bad-dfa857482d4c-kube-api-access-27srv\") pod \"metallb-operator-controller-manager-5c589f6ccd-bcpmf\" (UID: \"bb8f951b-6aa9-420c-9bad-dfa857482d4c\") " pod="metallb-system/metallb-operator-controller-manager-5c589f6ccd-bcpmf" Mar 20 11:11:30 crc kubenswrapper[4860]: I0320 11:11:30.670370 4860 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bb8f951b-6aa9-420c-9bad-dfa857482d4c-apiservice-cert\") pod \"metallb-operator-controller-manager-5c589f6ccd-bcpmf\" (UID: \"bb8f951b-6aa9-420c-9bad-dfa857482d4c\") " pod="metallb-system/metallb-operator-controller-manager-5c589f6ccd-bcpmf" Mar 20 11:11:30 crc kubenswrapper[4860]: I0320 11:11:30.674583 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bb8f951b-6aa9-420c-9bad-dfa857482d4c-webhook-cert\") pod \"metallb-operator-controller-manager-5c589f6ccd-bcpmf\" (UID: \"bb8f951b-6aa9-420c-9bad-dfa857482d4c\") " pod="metallb-system/metallb-operator-controller-manager-5c589f6ccd-bcpmf" Mar 20 11:11:30 crc kubenswrapper[4860]: I0320 11:11:30.682167 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27srv\" (UniqueName: \"kubernetes.io/projected/bb8f951b-6aa9-420c-9bad-dfa857482d4c-kube-api-access-27srv\") pod \"metallb-operator-controller-manager-5c589f6ccd-bcpmf\" (UID: \"bb8f951b-6aa9-420c-9bad-dfa857482d4c\") " pod="metallb-system/metallb-operator-controller-manager-5c589f6ccd-bcpmf" Mar 20 11:11:30 crc kubenswrapper[4860]: I0320 11:11:30.769482 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-587dc5fb9c-2t48s"] Mar 20 11:11:30 crc kubenswrapper[4860]: I0320 11:11:30.770881 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-587dc5fb9c-2t48s" Mar 20 11:11:30 crc kubenswrapper[4860]: I0320 11:11:30.773392 4860 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-hzlvn" Mar 20 11:11:30 crc kubenswrapper[4860]: I0320 11:11:30.773704 4860 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 20 11:11:30 crc kubenswrapper[4860]: I0320 11:11:30.774583 4860 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 20 11:11:30 crc kubenswrapper[4860]: I0320 11:11:30.793678 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-587dc5fb9c-2t48s"] Mar 20 11:11:30 crc kubenswrapper[4860]: I0320 11:11:30.817212 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5c589f6ccd-bcpmf" Mar 20 11:11:30 crc kubenswrapper[4860]: I0320 11:11:30.864532 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1eb0189c-2177-4c4e-83f6-7ba051322847-apiservice-cert\") pod \"metallb-operator-webhook-server-587dc5fb9c-2t48s\" (UID: \"1eb0189c-2177-4c4e-83f6-7ba051322847\") " pod="metallb-system/metallb-operator-webhook-server-587dc5fb9c-2t48s" Mar 20 11:11:30 crc kubenswrapper[4860]: I0320 11:11:30.864630 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfnxg\" (UniqueName: \"kubernetes.io/projected/1eb0189c-2177-4c4e-83f6-7ba051322847-kube-api-access-hfnxg\") pod \"metallb-operator-webhook-server-587dc5fb9c-2t48s\" (UID: \"1eb0189c-2177-4c4e-83f6-7ba051322847\") " pod="metallb-system/metallb-operator-webhook-server-587dc5fb9c-2t48s" Mar 20 11:11:30 crc kubenswrapper[4860]: 
I0320 11:11:30.864667 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1eb0189c-2177-4c4e-83f6-7ba051322847-webhook-cert\") pod \"metallb-operator-webhook-server-587dc5fb9c-2t48s\" (UID: \"1eb0189c-2177-4c4e-83f6-7ba051322847\") " pod="metallb-system/metallb-operator-webhook-server-587dc5fb9c-2t48s" Mar 20 11:11:30 crc kubenswrapper[4860]: I0320 11:11:30.937854 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-nskpj" Mar 20 11:11:30 crc kubenswrapper[4860]: I0320 11:11:30.966400 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1eb0189c-2177-4c4e-83f6-7ba051322847-apiservice-cert\") pod \"metallb-operator-webhook-server-587dc5fb9c-2t48s\" (UID: \"1eb0189c-2177-4c4e-83f6-7ba051322847\") " pod="metallb-system/metallb-operator-webhook-server-587dc5fb9c-2t48s" Mar 20 11:11:30 crc kubenswrapper[4860]: I0320 11:11:30.966515 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfnxg\" (UniqueName: \"kubernetes.io/projected/1eb0189c-2177-4c4e-83f6-7ba051322847-kube-api-access-hfnxg\") pod \"metallb-operator-webhook-server-587dc5fb9c-2t48s\" (UID: \"1eb0189c-2177-4c4e-83f6-7ba051322847\") " pod="metallb-system/metallb-operator-webhook-server-587dc5fb9c-2t48s" Mar 20 11:11:30 crc kubenswrapper[4860]: I0320 11:11:30.966573 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1eb0189c-2177-4c4e-83f6-7ba051322847-webhook-cert\") pod \"metallb-operator-webhook-server-587dc5fb9c-2t48s\" (UID: \"1eb0189c-2177-4c4e-83f6-7ba051322847\") " pod="metallb-system/metallb-operator-webhook-server-587dc5fb9c-2t48s" Mar 20 11:11:30 crc kubenswrapper[4860]: I0320 11:11:30.973607 4860 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1eb0189c-2177-4c4e-83f6-7ba051322847-webhook-cert\") pod \"metallb-operator-webhook-server-587dc5fb9c-2t48s\" (UID: \"1eb0189c-2177-4c4e-83f6-7ba051322847\") " pod="metallb-system/metallb-operator-webhook-server-587dc5fb9c-2t48s" Mar 20 11:11:30 crc kubenswrapper[4860]: I0320 11:11:30.973694 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1eb0189c-2177-4c4e-83f6-7ba051322847-apiservice-cert\") pod \"metallb-operator-webhook-server-587dc5fb9c-2t48s\" (UID: \"1eb0189c-2177-4c4e-83f6-7ba051322847\") " pod="metallb-system/metallb-operator-webhook-server-587dc5fb9c-2t48s" Mar 20 11:11:30 crc kubenswrapper[4860]: I0320 11:11:30.997656 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfnxg\" (UniqueName: \"kubernetes.io/projected/1eb0189c-2177-4c4e-83f6-7ba051322847-kube-api-access-hfnxg\") pod \"metallb-operator-webhook-server-587dc5fb9c-2t48s\" (UID: \"1eb0189c-2177-4c4e-83f6-7ba051322847\") " pod="metallb-system/metallb-operator-webhook-server-587dc5fb9c-2t48s" Mar 20 11:11:31 crc kubenswrapper[4860]: I0320 11:11:31.037728 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hmtwf"] Mar 20 11:11:31 crc kubenswrapper[4860]: I0320 11:11:31.039114 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hmtwf" Mar 20 11:11:31 crc kubenswrapper[4860]: I0320 11:11:31.054772 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hmtwf"] Mar 20 11:11:31 crc kubenswrapper[4860]: I0320 11:11:31.089454 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-587dc5fb9c-2t48s" Mar 20 11:11:31 crc kubenswrapper[4860]: I0320 11:11:31.175300 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/422d4ef5-4317-467e-b7cb-e20258f2865d-catalog-content\") pod \"community-operators-hmtwf\" (UID: \"422d4ef5-4317-467e-b7cb-e20258f2865d\") " pod="openshift-marketplace/community-operators-hmtwf" Mar 20 11:11:31 crc kubenswrapper[4860]: I0320 11:11:31.175874 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/422d4ef5-4317-467e-b7cb-e20258f2865d-utilities\") pod \"community-operators-hmtwf\" (UID: \"422d4ef5-4317-467e-b7cb-e20258f2865d\") " pod="openshift-marketplace/community-operators-hmtwf" Mar 20 11:11:31 crc kubenswrapper[4860]: I0320 11:11:31.175908 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9trl\" (UniqueName: \"kubernetes.io/projected/422d4ef5-4317-467e-b7cb-e20258f2865d-kube-api-access-q9trl\") pod \"community-operators-hmtwf\" (UID: \"422d4ef5-4317-467e-b7cb-e20258f2865d\") " pod="openshift-marketplace/community-operators-hmtwf" Mar 20 11:11:31 crc kubenswrapper[4860]: I0320 11:11:31.208043 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5c589f6ccd-bcpmf"] Mar 20 11:11:31 crc kubenswrapper[4860]: W0320 11:11:31.239750 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb8f951b_6aa9_420c_9bad_dfa857482d4c.slice/crio-1bc5c987d948a8cfbf2f21007a00b7f113bd7202f8a89641c803aee41d9e966d WatchSource:0}: Error finding container 1bc5c987d948a8cfbf2f21007a00b7f113bd7202f8a89641c803aee41d9e966d: Status 404 returned error can't find the 
container with id 1bc5c987d948a8cfbf2f21007a00b7f113bd7202f8a89641c803aee41d9e966d Mar 20 11:11:31 crc kubenswrapper[4860]: I0320 11:11:31.277312 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/422d4ef5-4317-467e-b7cb-e20258f2865d-utilities\") pod \"community-operators-hmtwf\" (UID: \"422d4ef5-4317-467e-b7cb-e20258f2865d\") " pod="openshift-marketplace/community-operators-hmtwf" Mar 20 11:11:31 crc kubenswrapper[4860]: I0320 11:11:31.277369 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9trl\" (UniqueName: \"kubernetes.io/projected/422d4ef5-4317-467e-b7cb-e20258f2865d-kube-api-access-q9trl\") pod \"community-operators-hmtwf\" (UID: \"422d4ef5-4317-467e-b7cb-e20258f2865d\") " pod="openshift-marketplace/community-operators-hmtwf" Mar 20 11:11:31 crc kubenswrapper[4860]: I0320 11:11:31.277418 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/422d4ef5-4317-467e-b7cb-e20258f2865d-catalog-content\") pod \"community-operators-hmtwf\" (UID: \"422d4ef5-4317-467e-b7cb-e20258f2865d\") " pod="openshift-marketplace/community-operators-hmtwf" Mar 20 11:11:31 crc kubenswrapper[4860]: I0320 11:11:31.278122 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/422d4ef5-4317-467e-b7cb-e20258f2865d-catalog-content\") pod \"community-operators-hmtwf\" (UID: \"422d4ef5-4317-467e-b7cb-e20258f2865d\") " pod="openshift-marketplace/community-operators-hmtwf" Mar 20 11:11:31 crc kubenswrapper[4860]: I0320 11:11:31.278417 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/422d4ef5-4317-467e-b7cb-e20258f2865d-utilities\") pod \"community-operators-hmtwf\" (UID: \"422d4ef5-4317-467e-b7cb-e20258f2865d\") " 
pod="openshift-marketplace/community-operators-hmtwf" Mar 20 11:11:31 crc kubenswrapper[4860]: I0320 11:11:31.325305 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9trl\" (UniqueName: \"kubernetes.io/projected/422d4ef5-4317-467e-b7cb-e20258f2865d-kube-api-access-q9trl\") pod \"community-operators-hmtwf\" (UID: \"422d4ef5-4317-467e-b7cb-e20258f2865d\") " pod="openshift-marketplace/community-operators-hmtwf" Mar 20 11:11:31 crc kubenswrapper[4860]: I0320 11:11:31.361810 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jwpk7" Mar 20 11:11:31 crc kubenswrapper[4860]: I0320 11:11:31.361868 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jwpk7" Mar 20 11:11:31 crc kubenswrapper[4860]: I0320 11:11:31.389930 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hmtwf" Mar 20 11:11:31 crc kubenswrapper[4860]: I0320 11:11:31.576835 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jwpk7" Mar 20 11:11:31 crc kubenswrapper[4860]: I0320 11:11:31.611712 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-587dc5fb9c-2t48s"] Mar 20 11:11:31 crc kubenswrapper[4860]: W0320 11:11:31.636816 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1eb0189c_2177_4c4e_83f6_7ba051322847.slice/crio-88cd98393c43887dd770ab7c2de91e79cfb9f8082dab97cb7fd8ce92e87bcf96 WatchSource:0}: Error finding container 88cd98393c43887dd770ab7c2de91e79cfb9f8082dab97cb7fd8ce92e87bcf96: Status 404 returned error can't find the container with id 88cd98393c43887dd770ab7c2de91e79cfb9f8082dab97cb7fd8ce92e87bcf96 Mar 20 11:11:31 crc kubenswrapper[4860]: I0320 
11:11:31.845837 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hmtwf"] Mar 20 11:11:31 crc kubenswrapper[4860]: I0320 11:11:31.881476 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5c589f6ccd-bcpmf" event={"ID":"bb8f951b-6aa9-420c-9bad-dfa857482d4c","Type":"ContainerStarted","Data":"1bc5c987d948a8cfbf2f21007a00b7f113bd7202f8a89641c803aee41d9e966d"} Mar 20 11:11:31 crc kubenswrapper[4860]: I0320 11:11:31.886832 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-587dc5fb9c-2t48s" event={"ID":"1eb0189c-2177-4c4e-83f6-7ba051322847","Type":"ContainerStarted","Data":"88cd98393c43887dd770ab7c2de91e79cfb9f8082dab97cb7fd8ce92e87bcf96"} Mar 20 11:11:31 crc kubenswrapper[4860]: I0320 11:11:31.973491 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jwpk7" Mar 20 11:11:32 crc kubenswrapper[4860]: I0320 11:11:32.897784 4860 generic.go:334] "Generic (PLEG): container finished" podID="422d4ef5-4317-467e-b7cb-e20258f2865d" containerID="e2ac6d203aeb92405284cfef9812c477008e52d2c2ddbae1ce0cb7bf131fb282" exitCode=0 Mar 20 11:11:32 crc kubenswrapper[4860]: I0320 11:11:32.898401 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hmtwf" event={"ID":"422d4ef5-4317-467e-b7cb-e20258f2865d","Type":"ContainerDied","Data":"e2ac6d203aeb92405284cfef9812c477008e52d2c2ddbae1ce0cb7bf131fb282"} Mar 20 11:11:32 crc kubenswrapper[4860]: I0320 11:11:32.898599 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hmtwf" event={"ID":"422d4ef5-4317-467e-b7cb-e20258f2865d","Type":"ContainerStarted","Data":"70c9e8ca00f897038aed84c357d0f86149e529ad64db01ad212da50e9a0df3da"} Mar 20 11:11:34 crc kubenswrapper[4860]: I0320 11:11:34.662456 4860 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nskpj"] Mar 20 11:11:34 crc kubenswrapper[4860]: I0320 11:11:34.663169 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-nskpj" podUID="659f81bd-8f32-4dd2-9a63-6b2665fa6647" containerName="registry-server" containerID="cri-o://42bad4c11bf889acf02f0318595f240e854802661f3002d50e2d7621ab40266d" gracePeriod=2 Mar 20 11:11:34 crc kubenswrapper[4860]: I0320 11:11:34.933183 4860 generic.go:334] "Generic (PLEG): container finished" podID="659f81bd-8f32-4dd2-9a63-6b2665fa6647" containerID="42bad4c11bf889acf02f0318595f240e854802661f3002d50e2d7621ab40266d" exitCode=0 Mar 20 11:11:34 crc kubenswrapper[4860]: I0320 11:11:34.933249 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nskpj" event={"ID":"659f81bd-8f32-4dd2-9a63-6b2665fa6647","Type":"ContainerDied","Data":"42bad4c11bf889acf02f0318595f240e854802661f3002d50e2d7621ab40266d"} Mar 20 11:11:35 crc kubenswrapper[4860]: I0320 11:11:35.818696 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nskpj" Mar 20 11:11:35 crc kubenswrapper[4860]: I0320 11:11:35.912405 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/659f81bd-8f32-4dd2-9a63-6b2665fa6647-utilities\") pod \"659f81bd-8f32-4dd2-9a63-6b2665fa6647\" (UID: \"659f81bd-8f32-4dd2-9a63-6b2665fa6647\") " Mar 20 11:11:35 crc kubenswrapper[4860]: I0320 11:11:35.912468 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/659f81bd-8f32-4dd2-9a63-6b2665fa6647-catalog-content\") pod \"659f81bd-8f32-4dd2-9a63-6b2665fa6647\" (UID: \"659f81bd-8f32-4dd2-9a63-6b2665fa6647\") " Mar 20 11:11:35 crc kubenswrapper[4860]: I0320 11:11:35.912488 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngdgx\" (UniqueName: \"kubernetes.io/projected/659f81bd-8f32-4dd2-9a63-6b2665fa6647-kube-api-access-ngdgx\") pod \"659f81bd-8f32-4dd2-9a63-6b2665fa6647\" (UID: \"659f81bd-8f32-4dd2-9a63-6b2665fa6647\") " Mar 20 11:11:35 crc kubenswrapper[4860]: I0320 11:11:35.915638 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/659f81bd-8f32-4dd2-9a63-6b2665fa6647-utilities" (OuterVolumeSpecName: "utilities") pod "659f81bd-8f32-4dd2-9a63-6b2665fa6647" (UID: "659f81bd-8f32-4dd2-9a63-6b2665fa6647"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:11:35 crc kubenswrapper[4860]: I0320 11:11:35.922093 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/659f81bd-8f32-4dd2-9a63-6b2665fa6647-kube-api-access-ngdgx" (OuterVolumeSpecName: "kube-api-access-ngdgx") pod "659f81bd-8f32-4dd2-9a63-6b2665fa6647" (UID: "659f81bd-8f32-4dd2-9a63-6b2665fa6647"). InnerVolumeSpecName "kube-api-access-ngdgx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:11:35 crc kubenswrapper[4860]: I0320 11:11:35.942798 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5c589f6ccd-bcpmf" event={"ID":"bb8f951b-6aa9-420c-9bad-dfa857482d4c","Type":"ContainerStarted","Data":"2807ec143e9f4b047895d4a3fdab36e21f03e518540ddd30bffa5f634ec12d7f"} Mar 20 11:11:35 crc kubenswrapper[4860]: I0320 11:11:35.943064 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-5c589f6ccd-bcpmf" Mar 20 11:11:35 crc kubenswrapper[4860]: I0320 11:11:35.949510 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/659f81bd-8f32-4dd2-9a63-6b2665fa6647-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "659f81bd-8f32-4dd2-9a63-6b2665fa6647" (UID: "659f81bd-8f32-4dd2-9a63-6b2665fa6647"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:11:35 crc kubenswrapper[4860]: I0320 11:11:35.954076 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hmtwf" event={"ID":"422d4ef5-4317-467e-b7cb-e20258f2865d","Type":"ContainerStarted","Data":"9323b3425e5b64abc66c4f89a77758eebaefc4a3e684f49cd55ba24c87a0c405"} Mar 20 11:11:35 crc kubenswrapper[4860]: I0320 11:11:35.958246 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nskpj" event={"ID":"659f81bd-8f32-4dd2-9a63-6b2665fa6647","Type":"ContainerDied","Data":"41dfd9902a89b0dfeabbbcc8c55d12c6c1350b3b1f92b4a588be3bf82690038f"} Mar 20 11:11:35 crc kubenswrapper[4860]: I0320 11:11:35.958351 4860 scope.go:117] "RemoveContainer" containerID="42bad4c11bf889acf02f0318595f240e854802661f3002d50e2d7621ab40266d" Mar 20 11:11:35 crc kubenswrapper[4860]: I0320 11:11:35.958289 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nskpj" Mar 20 11:11:35 crc kubenswrapper[4860]: I0320 11:11:35.975720 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-5c589f6ccd-bcpmf" podStartSLOduration=1.65905661 podStartE2EDuration="5.97569206s" podCreationTimestamp="2026-03-20 11:11:30 +0000 UTC" firstStartedPulling="2026-03-20 11:11:31.25613017 +0000 UTC m=+1015.477491058" lastFinishedPulling="2026-03-20 11:11:35.57276561 +0000 UTC m=+1019.794126508" observedRunningTime="2026-03-20 11:11:35.963341597 +0000 UTC m=+1020.184702485" watchObservedRunningTime="2026-03-20 11:11:35.97569206 +0000 UTC m=+1020.197052958" Mar 20 11:11:36 crc kubenswrapper[4860]: I0320 11:11:36.003841 4860 scope.go:117] "RemoveContainer" containerID="c6814f68bcca89f3e902966194b072801e04bd2ac54dbd226c91e59deb878610" Mar 20 11:11:36 crc kubenswrapper[4860]: I0320 11:11:36.014717 4860 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/659f81bd-8f32-4dd2-9a63-6b2665fa6647-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:11:36 crc kubenswrapper[4860]: I0320 11:11:36.014763 4860 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/659f81bd-8f32-4dd2-9a63-6b2665fa6647-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:11:36 crc kubenswrapper[4860]: I0320 11:11:36.014777 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngdgx\" (UniqueName: \"kubernetes.io/projected/659f81bd-8f32-4dd2-9a63-6b2665fa6647-kube-api-access-ngdgx\") on node \"crc\" DevicePath \"\"" Mar 20 11:11:36 crc kubenswrapper[4860]: I0320 11:11:36.020699 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nskpj"] Mar 20 11:11:36 crc kubenswrapper[4860]: I0320 11:11:36.035678 4860 scope.go:117] "RemoveContainer" 
containerID="a3af29e4cd182fe4ad8f9d7ac048c0de3c19c12bb591945ad5bc082f0dae2c7c" Mar 20 11:11:36 crc kubenswrapper[4860]: I0320 11:11:36.047681 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-nskpj"] Mar 20 11:11:36 crc kubenswrapper[4860]: I0320 11:11:36.971814 4860 generic.go:334] "Generic (PLEG): container finished" podID="422d4ef5-4317-467e-b7cb-e20258f2865d" containerID="9323b3425e5b64abc66c4f89a77758eebaefc4a3e684f49cd55ba24c87a0c405" exitCode=0 Mar 20 11:11:36 crc kubenswrapper[4860]: I0320 11:11:36.973301 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hmtwf" event={"ID":"422d4ef5-4317-467e-b7cb-e20258f2865d","Type":"ContainerDied","Data":"9323b3425e5b64abc66c4f89a77758eebaefc4a3e684f49cd55ba24c87a0c405"} Mar 20 11:11:37 crc kubenswrapper[4860]: I0320 11:11:37.423463 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="659f81bd-8f32-4dd2-9a63-6b2665fa6647" path="/var/lib/kubelet/pods/659f81bd-8f32-4dd2-9a63-6b2665fa6647/volumes" Mar 20 11:11:37 crc kubenswrapper[4860]: I0320 11:11:37.816535 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jwpk7"] Mar 20 11:11:37 crc kubenswrapper[4860]: I0320 11:11:37.816838 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jwpk7" podUID="c65a3188-9f0a-4525-b876-635822d8dda4" containerName="registry-server" containerID="cri-o://fcf22871cd5e74e6fe79c7e6a0618d042337435567daae70e294425c235cfefb" gracePeriod=2 Mar 20 11:11:38 crc kubenswrapper[4860]: I0320 11:11:38.003614 4860 generic.go:334] "Generic (PLEG): container finished" podID="c65a3188-9f0a-4525-b876-635822d8dda4" containerID="fcf22871cd5e74e6fe79c7e6a0618d042337435567daae70e294425c235cfefb" exitCode=0 Mar 20 11:11:38 crc kubenswrapper[4860]: I0320 11:11:38.003653 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-jwpk7" event={"ID":"c65a3188-9f0a-4525-b876-635822d8dda4","Type":"ContainerDied","Data":"fcf22871cd5e74e6fe79c7e6a0618d042337435567daae70e294425c235cfefb"} Mar 20 11:11:39 crc kubenswrapper[4860]: I0320 11:11:39.191504 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jwpk7" Mar 20 11:11:39 crc kubenswrapper[4860]: I0320 11:11:39.229395 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c65a3188-9f0a-4525-b876-635822d8dda4-catalog-content\") pod \"c65a3188-9f0a-4525-b876-635822d8dda4\" (UID: \"c65a3188-9f0a-4525-b876-635822d8dda4\") " Mar 20 11:11:39 crc kubenswrapper[4860]: I0320 11:11:39.229495 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c65a3188-9f0a-4525-b876-635822d8dda4-utilities\") pod \"c65a3188-9f0a-4525-b876-635822d8dda4\" (UID: \"c65a3188-9f0a-4525-b876-635822d8dda4\") " Mar 20 11:11:39 crc kubenswrapper[4860]: I0320 11:11:39.229521 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6jkn\" (UniqueName: \"kubernetes.io/projected/c65a3188-9f0a-4525-b876-635822d8dda4-kube-api-access-d6jkn\") pod \"c65a3188-9f0a-4525-b876-635822d8dda4\" (UID: \"c65a3188-9f0a-4525-b876-635822d8dda4\") " Mar 20 11:11:39 crc kubenswrapper[4860]: I0320 11:11:39.236712 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c65a3188-9f0a-4525-b876-635822d8dda4-utilities" (OuterVolumeSpecName: "utilities") pod "c65a3188-9f0a-4525-b876-635822d8dda4" (UID: "c65a3188-9f0a-4525-b876-635822d8dda4"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:11:39 crc kubenswrapper[4860]: I0320 11:11:39.258446 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c65a3188-9f0a-4525-b876-635822d8dda4-kube-api-access-d6jkn" (OuterVolumeSpecName: "kube-api-access-d6jkn") pod "c65a3188-9f0a-4525-b876-635822d8dda4" (UID: "c65a3188-9f0a-4525-b876-635822d8dda4"). InnerVolumeSpecName "kube-api-access-d6jkn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:11:39 crc kubenswrapper[4860]: I0320 11:11:39.318455 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c65a3188-9f0a-4525-b876-635822d8dda4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c65a3188-9f0a-4525-b876-635822d8dda4" (UID: "c65a3188-9f0a-4525-b876-635822d8dda4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:11:39 crc kubenswrapper[4860]: I0320 11:11:39.330923 4860 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c65a3188-9f0a-4525-b876-635822d8dda4-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:11:39 crc kubenswrapper[4860]: I0320 11:11:39.330964 4860 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c65a3188-9f0a-4525-b876-635822d8dda4-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:11:39 crc kubenswrapper[4860]: I0320 11:11:39.330978 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6jkn\" (UniqueName: \"kubernetes.io/projected/c65a3188-9f0a-4525-b876-635822d8dda4-kube-api-access-d6jkn\") on node \"crc\" DevicePath \"\"" Mar 20 11:11:40 crc kubenswrapper[4860]: I0320 11:11:40.018097 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hmtwf" 
event={"ID":"422d4ef5-4317-467e-b7cb-e20258f2865d","Type":"ContainerStarted","Data":"2e40b057d300f1a934ecff02427dd0d8255ffaa10e640a70d306832ce1fd7b46"} Mar 20 11:11:40 crc kubenswrapper[4860]: I0320 11:11:40.022022 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jwpk7" event={"ID":"c65a3188-9f0a-4525-b876-635822d8dda4","Type":"ContainerDied","Data":"e6307789218fa3c6a03a6f158c301aa7e53133c1765dba4abc69e40da0ce5696"} Mar 20 11:11:40 crc kubenswrapper[4860]: I0320 11:11:40.022080 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jwpk7" Mar 20 11:11:40 crc kubenswrapper[4860]: I0320 11:11:40.022137 4860 scope.go:117] "RemoveContainer" containerID="fcf22871cd5e74e6fe79c7e6a0618d042337435567daae70e294425c235cfefb" Mar 20 11:11:40 crc kubenswrapper[4860]: I0320 11:11:40.024147 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-587dc5fb9c-2t48s" event={"ID":"1eb0189c-2177-4c4e-83f6-7ba051322847","Type":"ContainerStarted","Data":"e891085d91de217023d537d53a0d32496654849cd5143ddadaa349181dfda470"} Mar 20 11:11:40 crc kubenswrapper[4860]: I0320 11:11:40.024905 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-587dc5fb9c-2t48s" Mar 20 11:11:40 crc kubenswrapper[4860]: I0320 11:11:40.043778 4860 scope.go:117] "RemoveContainer" containerID="e68be3bab36d85e5b37688fc943817973a421b95bb420a49b7aa9a1cf465b12b" Mar 20 11:11:40 crc kubenswrapper[4860]: I0320 11:11:40.057699 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hmtwf" podStartSLOduration=3.619979625 podStartE2EDuration="10.057678196s" podCreationTimestamp="2026-03-20 11:11:30 +0000 UTC" firstStartedPulling="2026-03-20 11:11:32.899431643 +0000 UTC m=+1017.120792531" lastFinishedPulling="2026-03-20 
11:11:39.337130204 +0000 UTC m=+1023.558491102" observedRunningTime="2026-03-20 11:11:40.052947278 +0000 UTC m=+1024.274308176" watchObservedRunningTime="2026-03-20 11:11:40.057678196 +0000 UTC m=+1024.279039094" Mar 20 11:11:40 crc kubenswrapper[4860]: I0320 11:11:40.065516 4860 scope.go:117] "RemoveContainer" containerID="e14537304077b317ad670f8e2115905292466a260253edc43ac45ce35e816ef6" Mar 20 11:11:40 crc kubenswrapper[4860]: I0320 11:11:40.071400 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jwpk7"] Mar 20 11:11:40 crc kubenswrapper[4860]: I0320 11:11:40.084875 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jwpk7"] Mar 20 11:11:40 crc kubenswrapper[4860]: I0320 11:11:40.109269 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-587dc5fb9c-2t48s" podStartSLOduration=2.817079343 podStartE2EDuration="10.109224065s" podCreationTimestamp="2026-03-20 11:11:30 +0000 UTC" firstStartedPulling="2026-03-20 11:11:31.645376881 +0000 UTC m=+1015.866737779" lastFinishedPulling="2026-03-20 11:11:38.937521603 +0000 UTC m=+1023.158882501" observedRunningTime="2026-03-20 11:11:40.106687027 +0000 UTC m=+1024.328047935" watchObservedRunningTime="2026-03-20 11:11:40.109224065 +0000 UTC m=+1024.330584963" Mar 20 11:11:41 crc kubenswrapper[4860]: I0320 11:11:41.391408 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hmtwf" Mar 20 11:11:41 crc kubenswrapper[4860]: I0320 11:11:41.391460 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hmtwf" Mar 20 11:11:41 crc kubenswrapper[4860]: I0320 11:11:41.422008 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c65a3188-9f0a-4525-b876-635822d8dda4" 
path="/var/lib/kubelet/pods/c65a3188-9f0a-4525-b876-635822d8dda4/volumes" Mar 20 11:11:42 crc kubenswrapper[4860]: I0320 11:11:42.435692 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-hmtwf" podUID="422d4ef5-4317-467e-b7cb-e20258f2865d" containerName="registry-server" probeResult="failure" output=< Mar 20 11:11:42 crc kubenswrapper[4860]: timeout: failed to connect service ":50051" within 1s Mar 20 11:11:42 crc kubenswrapper[4860]: > Mar 20 11:11:51 crc kubenswrapper[4860]: I0320 11:11:51.095866 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-587dc5fb9c-2t48s" Mar 20 11:11:51 crc kubenswrapper[4860]: I0320 11:11:51.434007 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hmtwf" Mar 20 11:11:51 crc kubenswrapper[4860]: I0320 11:11:51.483711 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hmtwf" Mar 20 11:11:52 crc kubenswrapper[4860]: I0320 11:11:52.344737 4860 patch_prober.go:28] interesting pod/machine-config-daemon-kvdqp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:11:52 crc kubenswrapper[4860]: I0320 11:11:52.344802 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:11:52 crc kubenswrapper[4860]: I0320 11:11:52.344855 4860 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" Mar 20 11:11:52 crc kubenswrapper[4860]: I0320 11:11:52.345653 4860 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8a778954d1374877671fc9bca86b7581cc1911487a943aba7bf61952dec5e818"} pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 11:11:52 crc kubenswrapper[4860]: I0320 11:11:52.345720 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" containerID="cri-o://8a778954d1374877671fc9bca86b7581cc1911487a943aba7bf61952dec5e818" gracePeriod=600 Mar 20 11:11:53 crc kubenswrapper[4860]: I0320 11:11:53.124620 4860 generic.go:334] "Generic (PLEG): container finished" podID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerID="8a778954d1374877671fc9bca86b7581cc1911487a943aba7bf61952dec5e818" exitCode=0 Mar 20 11:11:53 crc kubenswrapper[4860]: I0320 11:11:53.124726 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" event={"ID":"6a9df230-75a1-4b64-8d00-c179e9c19080","Type":"ContainerDied","Data":"8a778954d1374877671fc9bca86b7581cc1911487a943aba7bf61952dec5e818"} Mar 20 11:11:53 crc kubenswrapper[4860]: I0320 11:11:53.125266 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" event={"ID":"6a9df230-75a1-4b64-8d00-c179e9c19080","Type":"ContainerStarted","Data":"88d9c34d042546706542bef7e7c591b52fa1f874953861e5601a8c6aea607a26"} Mar 20 11:11:53 crc kubenswrapper[4860]: I0320 11:11:53.125306 4860 scope.go:117] "RemoveContainer" 
containerID="4b71d74fc211e299a940948264ea488965d39574d77d6b6b358fff8d3e35e4e3" Mar 20 11:11:54 crc kubenswrapper[4860]: I0320 11:11:54.805216 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hmtwf"] Mar 20 11:11:54 crc kubenswrapper[4860]: I0320 11:11:54.805998 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hmtwf" podUID="422d4ef5-4317-467e-b7cb-e20258f2865d" containerName="registry-server" containerID="cri-o://2e40b057d300f1a934ecff02427dd0d8255ffaa10e640a70d306832ce1fd7b46" gracePeriod=2 Mar 20 11:11:55 crc kubenswrapper[4860]: I0320 11:11:55.146773 4860 generic.go:334] "Generic (PLEG): container finished" podID="422d4ef5-4317-467e-b7cb-e20258f2865d" containerID="2e40b057d300f1a934ecff02427dd0d8255ffaa10e640a70d306832ce1fd7b46" exitCode=0 Mar 20 11:11:55 crc kubenswrapper[4860]: I0320 11:11:55.146838 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hmtwf" event={"ID":"422d4ef5-4317-467e-b7cb-e20258f2865d","Type":"ContainerDied","Data":"2e40b057d300f1a934ecff02427dd0d8255ffaa10e640a70d306832ce1fd7b46"} Mar 20 11:11:55 crc kubenswrapper[4860]: I0320 11:11:55.304033 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hmtwf" Mar 20 11:11:55 crc kubenswrapper[4860]: I0320 11:11:55.357129 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9trl\" (UniqueName: \"kubernetes.io/projected/422d4ef5-4317-467e-b7cb-e20258f2865d-kube-api-access-q9trl\") pod \"422d4ef5-4317-467e-b7cb-e20258f2865d\" (UID: \"422d4ef5-4317-467e-b7cb-e20258f2865d\") " Mar 20 11:11:55 crc kubenswrapper[4860]: I0320 11:11:55.357208 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/422d4ef5-4317-467e-b7cb-e20258f2865d-catalog-content\") pod \"422d4ef5-4317-467e-b7cb-e20258f2865d\" (UID: \"422d4ef5-4317-467e-b7cb-e20258f2865d\") " Mar 20 11:11:55 crc kubenswrapper[4860]: I0320 11:11:55.357248 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/422d4ef5-4317-467e-b7cb-e20258f2865d-utilities\") pod \"422d4ef5-4317-467e-b7cb-e20258f2865d\" (UID: \"422d4ef5-4317-467e-b7cb-e20258f2865d\") " Mar 20 11:11:55 crc kubenswrapper[4860]: I0320 11:11:55.358625 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/422d4ef5-4317-467e-b7cb-e20258f2865d-utilities" (OuterVolumeSpecName: "utilities") pod "422d4ef5-4317-467e-b7cb-e20258f2865d" (UID: "422d4ef5-4317-467e-b7cb-e20258f2865d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:11:55 crc kubenswrapper[4860]: I0320 11:11:55.364590 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/422d4ef5-4317-467e-b7cb-e20258f2865d-kube-api-access-q9trl" (OuterVolumeSpecName: "kube-api-access-q9trl") pod "422d4ef5-4317-467e-b7cb-e20258f2865d" (UID: "422d4ef5-4317-467e-b7cb-e20258f2865d"). InnerVolumeSpecName "kube-api-access-q9trl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:11:55 crc kubenswrapper[4860]: I0320 11:11:55.411583 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/422d4ef5-4317-467e-b7cb-e20258f2865d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "422d4ef5-4317-467e-b7cb-e20258f2865d" (UID: "422d4ef5-4317-467e-b7cb-e20258f2865d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:11:55 crc kubenswrapper[4860]: I0320 11:11:55.461083 4860 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/422d4ef5-4317-467e-b7cb-e20258f2865d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:11:55 crc kubenswrapper[4860]: I0320 11:11:55.461146 4860 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/422d4ef5-4317-467e-b7cb-e20258f2865d-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:11:55 crc kubenswrapper[4860]: I0320 11:11:55.461195 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9trl\" (UniqueName: \"kubernetes.io/projected/422d4ef5-4317-467e-b7cb-e20258f2865d-kube-api-access-q9trl\") on node \"crc\" DevicePath \"\"" Mar 20 11:11:56 crc kubenswrapper[4860]: I0320 11:11:56.156966 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hmtwf" event={"ID":"422d4ef5-4317-467e-b7cb-e20258f2865d","Type":"ContainerDied","Data":"70c9e8ca00f897038aed84c357d0f86149e529ad64db01ad212da50e9a0df3da"} Mar 20 11:11:56 crc kubenswrapper[4860]: I0320 11:11:56.157045 4860 scope.go:117] "RemoveContainer" containerID="2e40b057d300f1a934ecff02427dd0d8255ffaa10e640a70d306832ce1fd7b46" Mar 20 11:11:56 crc kubenswrapper[4860]: I0320 11:11:56.157067 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hmtwf" Mar 20 11:11:56 crc kubenswrapper[4860]: I0320 11:11:56.177922 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hmtwf"] Mar 20 11:11:56 crc kubenswrapper[4860]: I0320 11:11:56.182393 4860 scope.go:117] "RemoveContainer" containerID="9323b3425e5b64abc66c4f89a77758eebaefc4a3e684f49cd55ba24c87a0c405" Mar 20 11:11:56 crc kubenswrapper[4860]: I0320 11:11:56.186556 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hmtwf"] Mar 20 11:11:56 crc kubenswrapper[4860]: I0320 11:11:56.207003 4860 scope.go:117] "RemoveContainer" containerID="e2ac6d203aeb92405284cfef9812c477008e52d2c2ddbae1ce0cb7bf131fb282" Mar 20 11:11:57 crc kubenswrapper[4860]: I0320 11:11:57.421340 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="422d4ef5-4317-467e-b7cb-e20258f2865d" path="/var/lib/kubelet/pods/422d4ef5-4317-467e-b7cb-e20258f2865d/volumes" Mar 20 11:12:00 crc kubenswrapper[4860]: I0320 11:12:00.136950 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566752-kn2qv"] Mar 20 11:12:00 crc kubenswrapper[4860]: E0320 11:12:00.138802 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c65a3188-9f0a-4525-b876-635822d8dda4" containerName="extract-utilities" Mar 20 11:12:00 crc kubenswrapper[4860]: I0320 11:12:00.138882 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="c65a3188-9f0a-4525-b876-635822d8dda4" containerName="extract-utilities" Mar 20 11:12:00 crc kubenswrapper[4860]: E0320 11:12:00.138944 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c65a3188-9f0a-4525-b876-635822d8dda4" containerName="extract-content" Mar 20 11:12:00 crc kubenswrapper[4860]: I0320 11:12:00.139002 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="c65a3188-9f0a-4525-b876-635822d8dda4" 
containerName="extract-content" Mar 20 11:12:00 crc kubenswrapper[4860]: E0320 11:12:00.139058 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="422d4ef5-4317-467e-b7cb-e20258f2865d" containerName="extract-content" Mar 20 11:12:00 crc kubenswrapper[4860]: I0320 11:12:00.139109 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="422d4ef5-4317-467e-b7cb-e20258f2865d" containerName="extract-content" Mar 20 11:12:00 crc kubenswrapper[4860]: E0320 11:12:00.139169 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c65a3188-9f0a-4525-b876-635822d8dda4" containerName="registry-server" Mar 20 11:12:00 crc kubenswrapper[4860]: I0320 11:12:00.139218 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="c65a3188-9f0a-4525-b876-635822d8dda4" containerName="registry-server" Mar 20 11:12:00 crc kubenswrapper[4860]: E0320 11:12:00.139313 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="422d4ef5-4317-467e-b7cb-e20258f2865d" containerName="registry-server" Mar 20 11:12:00 crc kubenswrapper[4860]: I0320 11:12:00.139381 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="422d4ef5-4317-467e-b7cb-e20258f2865d" containerName="registry-server" Mar 20 11:12:00 crc kubenswrapper[4860]: E0320 11:12:00.139457 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="659f81bd-8f32-4dd2-9a63-6b2665fa6647" containerName="extract-utilities" Mar 20 11:12:00 crc kubenswrapper[4860]: I0320 11:12:00.139517 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="659f81bd-8f32-4dd2-9a63-6b2665fa6647" containerName="extract-utilities" Mar 20 11:12:00 crc kubenswrapper[4860]: E0320 11:12:00.139572 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="659f81bd-8f32-4dd2-9a63-6b2665fa6647" containerName="extract-content" Mar 20 11:12:00 crc kubenswrapper[4860]: I0320 11:12:00.139626 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="659f81bd-8f32-4dd2-9a63-6b2665fa6647" 
containerName="extract-content" Mar 20 11:12:00 crc kubenswrapper[4860]: E0320 11:12:00.139683 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="659f81bd-8f32-4dd2-9a63-6b2665fa6647" containerName="registry-server" Mar 20 11:12:00 crc kubenswrapper[4860]: I0320 11:12:00.139741 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="659f81bd-8f32-4dd2-9a63-6b2665fa6647" containerName="registry-server" Mar 20 11:12:00 crc kubenswrapper[4860]: E0320 11:12:00.139829 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="422d4ef5-4317-467e-b7cb-e20258f2865d" containerName="extract-utilities" Mar 20 11:12:00 crc kubenswrapper[4860]: I0320 11:12:00.139884 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="422d4ef5-4317-467e-b7cb-e20258f2865d" containerName="extract-utilities" Mar 20 11:12:00 crc kubenswrapper[4860]: I0320 11:12:00.140042 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="659f81bd-8f32-4dd2-9a63-6b2665fa6647" containerName="registry-server" Mar 20 11:12:00 crc kubenswrapper[4860]: I0320 11:12:00.140107 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="c65a3188-9f0a-4525-b876-635822d8dda4" containerName="registry-server" Mar 20 11:12:00 crc kubenswrapper[4860]: I0320 11:12:00.140166 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="422d4ef5-4317-467e-b7cb-e20258f2865d" containerName="registry-server" Mar 20 11:12:00 crc kubenswrapper[4860]: I0320 11:12:00.140727 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566752-kn2qv" Mar 20 11:12:00 crc kubenswrapper[4860]: I0320 11:12:00.146121 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-82p2f" Mar 20 11:12:00 crc kubenswrapper[4860]: I0320 11:12:00.149918 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:12:00 crc kubenswrapper[4860]: I0320 11:12:00.151209 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:12:00 crc kubenswrapper[4860]: I0320 11:12:00.152105 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566752-kn2qv"] Mar 20 11:12:00 crc kubenswrapper[4860]: I0320 11:12:00.322545 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhg5p\" (UniqueName: \"kubernetes.io/projected/fdc939b6-92ac-4e00-ae32-b518e4257043-kube-api-access-vhg5p\") pod \"auto-csr-approver-29566752-kn2qv\" (UID: \"fdc939b6-92ac-4e00-ae32-b518e4257043\") " pod="openshift-infra/auto-csr-approver-29566752-kn2qv" Mar 20 11:12:00 crc kubenswrapper[4860]: I0320 11:12:00.424369 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhg5p\" (UniqueName: \"kubernetes.io/projected/fdc939b6-92ac-4e00-ae32-b518e4257043-kube-api-access-vhg5p\") pod \"auto-csr-approver-29566752-kn2qv\" (UID: \"fdc939b6-92ac-4e00-ae32-b518e4257043\") " pod="openshift-infra/auto-csr-approver-29566752-kn2qv" Mar 20 11:12:00 crc kubenswrapper[4860]: I0320 11:12:00.447910 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhg5p\" (UniqueName: \"kubernetes.io/projected/fdc939b6-92ac-4e00-ae32-b518e4257043-kube-api-access-vhg5p\") pod \"auto-csr-approver-29566752-kn2qv\" (UID: \"fdc939b6-92ac-4e00-ae32-b518e4257043\") " 
pod="openshift-infra/auto-csr-approver-29566752-kn2qv" Mar 20 11:12:00 crc kubenswrapper[4860]: I0320 11:12:00.460877 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566752-kn2qv" Mar 20 11:12:00 crc kubenswrapper[4860]: I0320 11:12:00.895960 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566752-kn2qv"] Mar 20 11:12:01 crc kubenswrapper[4860]: I0320 11:12:01.191126 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566752-kn2qv" event={"ID":"fdc939b6-92ac-4e00-ae32-b518e4257043","Type":"ContainerStarted","Data":"6dd301a3f869b503ba08b5f916f05897dc806b4490317b136fd4c55c00db64b9"} Mar 20 11:12:02 crc kubenswrapper[4860]: I0320 11:12:02.199915 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566752-kn2qv" event={"ID":"fdc939b6-92ac-4e00-ae32-b518e4257043","Type":"ContainerStarted","Data":"986b40f9c6d66be5183f5bf7b868f2a3962c56f81df4ee2138cb170b1b825e18"} Mar 20 11:12:02 crc kubenswrapper[4860]: I0320 11:12:02.220820 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566752-kn2qv" podStartSLOduration=1.418901302 podStartE2EDuration="2.220797147s" podCreationTimestamp="2026-03-20 11:12:00 +0000 UTC" firstStartedPulling="2026-03-20 11:12:00.910742916 +0000 UTC m=+1045.132103814" lastFinishedPulling="2026-03-20 11:12:01.712638761 +0000 UTC m=+1045.933999659" observedRunningTime="2026-03-20 11:12:02.217149078 +0000 UTC m=+1046.438509986" watchObservedRunningTime="2026-03-20 11:12:02.220797147 +0000 UTC m=+1046.442158045" Mar 20 11:12:03 crc kubenswrapper[4860]: I0320 11:12:03.216127 4860 generic.go:334] "Generic (PLEG): container finished" podID="fdc939b6-92ac-4e00-ae32-b518e4257043" containerID="986b40f9c6d66be5183f5bf7b868f2a3962c56f81df4ee2138cb170b1b825e18" exitCode=0 Mar 20 11:12:03 crc 
kubenswrapper[4860]: I0320 11:12:03.216249 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566752-kn2qv" event={"ID":"fdc939b6-92ac-4e00-ae32-b518e4257043","Type":"ContainerDied","Data":"986b40f9c6d66be5183f5bf7b868f2a3962c56f81df4ee2138cb170b1b825e18"} Mar 20 11:12:04 crc kubenswrapper[4860]: I0320 11:12:04.478541 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566752-kn2qv" Mar 20 11:12:04 crc kubenswrapper[4860]: I0320 11:12:04.481387 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhg5p\" (UniqueName: \"kubernetes.io/projected/fdc939b6-92ac-4e00-ae32-b518e4257043-kube-api-access-vhg5p\") pod \"fdc939b6-92ac-4e00-ae32-b518e4257043\" (UID: \"fdc939b6-92ac-4e00-ae32-b518e4257043\") " Mar 20 11:12:04 crc kubenswrapper[4860]: I0320 11:12:04.490492 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdc939b6-92ac-4e00-ae32-b518e4257043-kube-api-access-vhg5p" (OuterVolumeSpecName: "kube-api-access-vhg5p") pod "fdc939b6-92ac-4e00-ae32-b518e4257043" (UID: "fdc939b6-92ac-4e00-ae32-b518e4257043"). InnerVolumeSpecName "kube-api-access-vhg5p". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 11:12:04 crc kubenswrapper[4860]: I0320 11:12:04.582984 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhg5p\" (UniqueName: \"kubernetes.io/projected/fdc939b6-92ac-4e00-ae32-b518e4257043-kube-api-access-vhg5p\") on node \"crc\" DevicePath \"\""
Mar 20 11:12:05 crc kubenswrapper[4860]: I0320 11:12:05.232412 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566752-kn2qv" event={"ID":"fdc939b6-92ac-4e00-ae32-b518e4257043","Type":"ContainerDied","Data":"6dd301a3f869b503ba08b5f916f05897dc806b4490317b136fd4c55c00db64b9"}
Mar 20 11:12:05 crc kubenswrapper[4860]: I0320 11:12:05.232470 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6dd301a3f869b503ba08b5f916f05897dc806b4490317b136fd4c55c00db64b9"
Mar 20 11:12:05 crc kubenswrapper[4860]: I0320 11:12:05.232535 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566752-kn2qv"
Mar 20 11:12:05 crc kubenswrapper[4860]: I0320 11:12:05.274730 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566746-xphvf"]
Mar 20 11:12:05 crc kubenswrapper[4860]: I0320 11:12:05.278158 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566746-xphvf"]
Mar 20 11:12:05 crc kubenswrapper[4860]: I0320 11:12:05.420387 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="200c6cd9-8753-4805-9a49-50d3e429ea33" path="/var/lib/kubelet/pods/200c6cd9-8753-4805-9a49-50d3e429ea33/volumes"
Mar 20 11:12:10 crc kubenswrapper[4860]: I0320 11:12:10.820670 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-5c589f6ccd-bcpmf"
Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.472748 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-mzzpz"]
Mar 20 11:12:11 crc kubenswrapper[4860]: E0320 11:12:11.473531 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdc939b6-92ac-4e00-ae32-b518e4257043" containerName="oc"
Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.473549 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdc939b6-92ac-4e00-ae32-b518e4257043" containerName="oc"
Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.473689 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdc939b6-92ac-4e00-ae32-b518e4257043" containerName="oc"
Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.476084 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-mzzpz"
Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.478441 4860 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret"
Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.478608 4860 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-wjgxm"
Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.483856 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup"
Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.488916 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-jhncx"]
Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.489773 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jhncx"
Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.493368 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/dcd69c7f-fded-4b09-bd44-607b27716196-reloader\") pod \"frr-k8s-mzzpz\" (UID: \"dcd69c7f-fded-4b09-bd44-607b27716196\") " pod="metallb-system/frr-k8s-mzzpz"
Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.493437 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/dcd69c7f-fded-4b09-bd44-607b27716196-metrics\") pod \"frr-k8s-mzzpz\" (UID: \"dcd69c7f-fded-4b09-bd44-607b27716196\") " pod="metallb-system/frr-k8s-mzzpz"
Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.493485 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/dcd69c7f-fded-4b09-bd44-607b27716196-frr-startup\") pod \"frr-k8s-mzzpz\" (UID: \"dcd69c7f-fded-4b09-bd44-607b27716196\") " pod="metallb-system/frr-k8s-mzzpz"
Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.493515 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzqwf\" (UniqueName: \"kubernetes.io/projected/dcd69c7f-fded-4b09-bd44-607b27716196-kube-api-access-wzqwf\") pod \"frr-k8s-mzzpz\" (UID: \"dcd69c7f-fded-4b09-bd44-607b27716196\") " pod="metallb-system/frr-k8s-mzzpz"
Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.493572 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dcd69c7f-fded-4b09-bd44-607b27716196-metrics-certs\") pod \"frr-k8s-mzzpz\" (UID: \"dcd69c7f-fded-4b09-bd44-607b27716196\") " pod="metallb-system/frr-k8s-mzzpz"
Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.493602 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/dcd69c7f-fded-4b09-bd44-607b27716196-frr-conf\") pod \"frr-k8s-mzzpz\" (UID: \"dcd69c7f-fded-4b09-bd44-607b27716196\") " pod="metallb-system/frr-k8s-mzzpz"
Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.493662 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/dcd69c7f-fded-4b09-bd44-607b27716196-frr-sockets\") pod \"frr-k8s-mzzpz\" (UID: \"dcd69c7f-fded-4b09-bd44-607b27716196\") " pod="metallb-system/frr-k8s-mzzpz"
Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.495789 4860 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert"
Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.509785 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-jhncx"]
Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.581039 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-brjk7"]
Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.582079 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-brjk7"
Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.584017 4860 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-kn7gq"
Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.584331 4860 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist"
Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.584496 4860 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret"
Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.585508 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2"
Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.592291 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-tdcbt"]
Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.593566 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-tdcbt"
Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.594640 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3c1b54d0-fcfb-451b-ae3d-b731d3f9f6de-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-jhncx\" (UID: \"3c1b54d0-fcfb-451b-ae3d-b731d3f9f6de\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jhncx"
Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.594700 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcfb9\" (UniqueName: \"kubernetes.io/projected/6ee4e9c2-66c1-4431-bde4-29d09a044a32-kube-api-access-vcfb9\") pod \"speaker-brjk7\" (UID: \"6ee4e9c2-66c1-4431-bde4-29d09a044a32\") " pod="metallb-system/speaker-brjk7"
Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.594751 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/6ee4e9c2-66c1-4431-bde4-29d09a044a32-metallb-excludel2\") pod \"speaker-brjk7\" (UID: \"6ee4e9c2-66c1-4431-bde4-29d09a044a32\") " pod="metallb-system/speaker-brjk7"
Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.594802 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dcd69c7f-fded-4b09-bd44-607b27716196-metrics-certs\") pod \"frr-k8s-mzzpz\" (UID: \"dcd69c7f-fded-4b09-bd44-607b27716196\") " pod="metallb-system/frr-k8s-mzzpz"
Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.594832 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/dcd69c7f-fded-4b09-bd44-607b27716196-frr-conf\") pod \"frr-k8s-mzzpz\" (UID: \"dcd69c7f-fded-4b09-bd44-607b27716196\") " pod="metallb-system/frr-k8s-mzzpz"
Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.594872 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/dcd69c7f-fded-4b09-bd44-607b27716196-frr-sockets\") pod \"frr-k8s-mzzpz\" (UID: \"dcd69c7f-fded-4b09-bd44-607b27716196\") " pod="metallb-system/frr-k8s-mzzpz"
Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.594897 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/dcd69c7f-fded-4b09-bd44-607b27716196-reloader\") pod \"frr-k8s-mzzpz\" (UID: \"dcd69c7f-fded-4b09-bd44-607b27716196\") " pod="metallb-system/frr-k8s-mzzpz"
Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.594929 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/dcd69c7f-fded-4b09-bd44-607b27716196-metrics\") pod \"frr-k8s-mzzpz\" (UID: \"dcd69c7f-fded-4b09-bd44-607b27716196\") " pod="metallb-system/frr-k8s-mzzpz"
Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.594951 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7q7r\" (UniqueName: \"kubernetes.io/projected/3c1b54d0-fcfb-451b-ae3d-b731d3f9f6de-kube-api-access-n7q7r\") pod \"frr-k8s-webhook-server-bcc4b6f68-jhncx\" (UID: \"3c1b54d0-fcfb-451b-ae3d-b731d3f9f6de\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jhncx"
Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.594978 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ee4e9c2-66c1-4431-bde4-29d09a044a32-metrics-certs\") pod \"speaker-brjk7\" (UID: \"6ee4e9c2-66c1-4431-bde4-29d09a044a32\") " pod="metallb-system/speaker-brjk7"
Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.595005 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6ee4e9c2-66c1-4431-bde4-29d09a044a32-memberlist\") pod \"speaker-brjk7\" (UID: \"6ee4e9c2-66c1-4431-bde4-29d09a044a32\") " pod="metallb-system/speaker-brjk7"
Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.595083 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/dcd69c7f-fded-4b09-bd44-607b27716196-frr-startup\") pod \"frr-k8s-mzzpz\" (UID: \"dcd69c7f-fded-4b09-bd44-607b27716196\") " pod="metallb-system/frr-k8s-mzzpz"
Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.595110 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzqwf\" (UniqueName: \"kubernetes.io/projected/dcd69c7f-fded-4b09-bd44-607b27716196-kube-api-access-wzqwf\") pod \"frr-k8s-mzzpz\" (UID: \"dcd69c7f-fded-4b09-bd44-607b27716196\") " pod="metallb-system/frr-k8s-mzzpz"
Mar 20 11:12:11 crc kubenswrapper[4860]: E0320 11:12:11.595739 4860 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found
Mar 20 11:12:11 crc kubenswrapper[4860]: E0320 11:12:11.595824 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dcd69c7f-fded-4b09-bd44-607b27716196-metrics-certs podName:dcd69c7f-fded-4b09-bd44-607b27716196 nodeName:}" failed. No retries permitted until 2026-03-20 11:12:12.09580293 +0000 UTC m=+1056.317163828 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dcd69c7f-fded-4b09-bd44-607b27716196-metrics-certs") pod "frr-k8s-mzzpz" (UID: "dcd69c7f-fded-4b09-bd44-607b27716196") : secret "frr-k8s-certs-secret" not found
Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.595901 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/dcd69c7f-fded-4b09-bd44-607b27716196-frr-sockets\") pod \"frr-k8s-mzzpz\" (UID: \"dcd69c7f-fded-4b09-bd44-607b27716196\") " pod="metallb-system/frr-k8s-mzzpz"
Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.596037 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/dcd69c7f-fded-4b09-bd44-607b27716196-reloader\") pod \"frr-k8s-mzzpz\" (UID: \"dcd69c7f-fded-4b09-bd44-607b27716196\") " pod="metallb-system/frr-k8s-mzzpz"
Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.596130 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/dcd69c7f-fded-4b09-bd44-607b27716196-frr-conf\") pod \"frr-k8s-mzzpz\" (UID: \"dcd69c7f-fded-4b09-bd44-607b27716196\") " pod="metallb-system/frr-k8s-mzzpz"
Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.596191 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/dcd69c7f-fded-4b09-bd44-607b27716196-metrics\") pod \"frr-k8s-mzzpz\" (UID: \"dcd69c7f-fded-4b09-bd44-607b27716196\") " pod="metallb-system/frr-k8s-mzzpz"
Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.596489 4860 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret"
Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.597286 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/dcd69c7f-fded-4b09-bd44-607b27716196-frr-startup\") pod \"frr-k8s-mzzpz\" (UID: \"dcd69c7f-fded-4b09-bd44-607b27716196\") " pod="metallb-system/frr-k8s-mzzpz"
Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.606952 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-tdcbt"]
Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.645140 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzqwf\" (UniqueName: \"kubernetes.io/projected/dcd69c7f-fded-4b09-bd44-607b27716196-kube-api-access-wzqwf\") pod \"frr-k8s-mzzpz\" (UID: \"dcd69c7f-fded-4b09-bd44-607b27716196\") " pod="metallb-system/frr-k8s-mzzpz"
Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.696503 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5k8pc\" (UniqueName: \"kubernetes.io/projected/a2a3b82e-416b-4757-8719-97c58493428e-kube-api-access-5k8pc\") pod \"controller-7bb4cc7c98-tdcbt\" (UID: \"a2a3b82e-416b-4757-8719-97c58493428e\") " pod="metallb-system/controller-7bb4cc7c98-tdcbt"
Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.696622 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a2a3b82e-416b-4757-8719-97c58493428e-metrics-certs\") pod \"controller-7bb4cc7c98-tdcbt\" (UID: \"a2a3b82e-416b-4757-8719-97c58493428e\") " pod="metallb-system/controller-7bb4cc7c98-tdcbt"
Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.696674 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7q7r\" (UniqueName: \"kubernetes.io/projected/3c1b54d0-fcfb-451b-ae3d-b731d3f9f6de-kube-api-access-n7q7r\") pod \"frr-k8s-webhook-server-bcc4b6f68-jhncx\" (UID: \"3c1b54d0-fcfb-451b-ae3d-b731d3f9f6de\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jhncx"
Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.696700 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ee4e9c2-66c1-4431-bde4-29d09a044a32-metrics-certs\") pod \"speaker-brjk7\" (UID: \"6ee4e9c2-66c1-4431-bde4-29d09a044a32\") " pod="metallb-system/speaker-brjk7"
Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.696724 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6ee4e9c2-66c1-4431-bde4-29d09a044a32-memberlist\") pod \"speaker-brjk7\" (UID: \"6ee4e9c2-66c1-4431-bde4-29d09a044a32\") " pod="metallb-system/speaker-brjk7"
Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.696773 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcfb9\" (UniqueName: \"kubernetes.io/projected/6ee4e9c2-66c1-4431-bde4-29d09a044a32-kube-api-access-vcfb9\") pod \"speaker-brjk7\" (UID: \"6ee4e9c2-66c1-4431-bde4-29d09a044a32\") " pod="metallb-system/speaker-brjk7"
Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.696800 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3c1b54d0-fcfb-451b-ae3d-b731d3f9f6de-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-jhncx\" (UID: \"3c1b54d0-fcfb-451b-ae3d-b731d3f9f6de\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jhncx"
Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.696833 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/6ee4e9c2-66c1-4431-bde4-29d09a044a32-metallb-excludel2\") pod \"speaker-brjk7\" (UID: \"6ee4e9c2-66c1-4431-bde4-29d09a044a32\") " pod="metallb-system/speaker-brjk7"
Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.696864 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a2a3b82e-416b-4757-8719-97c58493428e-cert\") pod \"controller-7bb4cc7c98-tdcbt\" (UID: \"a2a3b82e-416b-4757-8719-97c58493428e\") " pod="metallb-system/controller-7bb4cc7c98-tdcbt"
Mar 20 11:12:11 crc kubenswrapper[4860]: E0320 11:12:11.696872 4860 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found
Mar 20 11:12:11 crc kubenswrapper[4860]: E0320 11:12:11.696959 4860 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Mar 20 11:12:11 crc kubenswrapper[4860]: E0320 11:12:11.697005 4860 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found
Mar 20 11:12:11 crc kubenswrapper[4860]: E0320 11:12:11.696961 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ee4e9c2-66c1-4431-bde4-29d09a044a32-metrics-certs podName:6ee4e9c2-66c1-4431-bde4-29d09a044a32 nodeName:}" failed. No retries permitted until 2026-03-20 11:12:12.196936846 +0000 UTC m=+1056.418297744 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6ee4e9c2-66c1-4431-bde4-29d09a044a32-metrics-certs") pod "speaker-brjk7" (UID: "6ee4e9c2-66c1-4431-bde4-29d09a044a32") : secret "speaker-certs-secret" not found
Mar 20 11:12:11 crc kubenswrapper[4860]: E0320 11:12:11.697176 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ee4e9c2-66c1-4431-bde4-29d09a044a32-memberlist podName:6ee4e9c2-66c1-4431-bde4-29d09a044a32 nodeName:}" failed. No retries permitted until 2026-03-20 11:12:12.197140441 +0000 UTC m=+1056.418501469 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/6ee4e9c2-66c1-4431-bde4-29d09a044a32-memberlist") pod "speaker-brjk7" (UID: "6ee4e9c2-66c1-4431-bde4-29d09a044a32") : secret "metallb-memberlist" not found
Mar 20 11:12:11 crc kubenswrapper[4860]: E0320 11:12:11.697205 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c1b54d0-fcfb-451b-ae3d-b731d3f9f6de-cert podName:3c1b54d0-fcfb-451b-ae3d-b731d3f9f6de nodeName:}" failed. No retries permitted until 2026-03-20 11:12:12.197195653 +0000 UTC m=+1056.418556781 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3c1b54d0-fcfb-451b-ae3d-b731d3f9f6de-cert") pod "frr-k8s-webhook-server-bcc4b6f68-jhncx" (UID: "3c1b54d0-fcfb-451b-ae3d-b731d3f9f6de") : secret "frr-k8s-webhook-server-cert" not found
Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.698368 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/6ee4e9c2-66c1-4431-bde4-29d09a044a32-metallb-excludel2\") pod \"speaker-brjk7\" (UID: \"6ee4e9c2-66c1-4431-bde4-29d09a044a32\") " pod="metallb-system/speaker-brjk7"
Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.720141 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcfb9\" (UniqueName: \"kubernetes.io/projected/6ee4e9c2-66c1-4431-bde4-29d09a044a32-kube-api-access-vcfb9\") pod \"speaker-brjk7\" (UID: \"6ee4e9c2-66c1-4431-bde4-29d09a044a32\") " pod="metallb-system/speaker-brjk7"
Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.725880 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7q7r\" (UniqueName: \"kubernetes.io/projected/3c1b54d0-fcfb-451b-ae3d-b731d3f9f6de-kube-api-access-n7q7r\") pod \"frr-k8s-webhook-server-bcc4b6f68-jhncx\" (UID: \"3c1b54d0-fcfb-451b-ae3d-b731d3f9f6de\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jhncx"
Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.798477 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a2a3b82e-416b-4757-8719-97c58493428e-cert\") pod \"controller-7bb4cc7c98-tdcbt\" (UID: \"a2a3b82e-416b-4757-8719-97c58493428e\") " pod="metallb-system/controller-7bb4cc7c98-tdcbt"
Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.799016 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5k8pc\" (UniqueName: \"kubernetes.io/projected/a2a3b82e-416b-4757-8719-97c58493428e-kube-api-access-5k8pc\") pod \"controller-7bb4cc7c98-tdcbt\" (UID: \"a2a3b82e-416b-4757-8719-97c58493428e\") " pod="metallb-system/controller-7bb4cc7c98-tdcbt"
Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.799156 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a2a3b82e-416b-4757-8719-97c58493428e-metrics-certs\") pod \"controller-7bb4cc7c98-tdcbt\" (UID: \"a2a3b82e-416b-4757-8719-97c58493428e\") " pod="metallb-system/controller-7bb4cc7c98-tdcbt"
Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.800358 4860 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.803528 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a2a3b82e-416b-4757-8719-97c58493428e-metrics-certs\") pod \"controller-7bb4cc7c98-tdcbt\" (UID: \"a2a3b82e-416b-4757-8719-97c58493428e\") " pod="metallb-system/controller-7bb4cc7c98-tdcbt"
Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.816562 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a2a3b82e-416b-4757-8719-97c58493428e-cert\") pod \"controller-7bb4cc7c98-tdcbt\" (UID: \"a2a3b82e-416b-4757-8719-97c58493428e\") " pod="metallb-system/controller-7bb4cc7c98-tdcbt"
Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.820000 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5k8pc\" (UniqueName: \"kubernetes.io/projected/a2a3b82e-416b-4757-8719-97c58493428e-kube-api-access-5k8pc\") pod \"controller-7bb4cc7c98-tdcbt\" (UID: \"a2a3b82e-416b-4757-8719-97c58493428e\") " pod="metallb-system/controller-7bb4cc7c98-tdcbt"
Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.911829 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-tdcbt"
Mar 20 11:12:12 crc kubenswrapper[4860]: I0320 11:12:12.106130 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dcd69c7f-fded-4b09-bd44-607b27716196-metrics-certs\") pod \"frr-k8s-mzzpz\" (UID: \"dcd69c7f-fded-4b09-bd44-607b27716196\") " pod="metallb-system/frr-k8s-mzzpz"
Mar 20 11:12:12 crc kubenswrapper[4860]: I0320 11:12:12.118998 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dcd69c7f-fded-4b09-bd44-607b27716196-metrics-certs\") pod \"frr-k8s-mzzpz\" (UID: \"dcd69c7f-fded-4b09-bd44-607b27716196\") " pod="metallb-system/frr-k8s-mzzpz"
Mar 20 11:12:12 crc kubenswrapper[4860]: I0320 11:12:12.183372 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-tdcbt"]
Mar 20 11:12:12 crc kubenswrapper[4860]: I0320 11:12:12.207040 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6ee4e9c2-66c1-4431-bde4-29d09a044a32-memberlist\") pod \"speaker-brjk7\" (UID: \"6ee4e9c2-66c1-4431-bde4-29d09a044a32\") " pod="metallb-system/speaker-brjk7"
Mar 20 11:12:12 crc kubenswrapper[4860]: I0320 11:12:12.207138 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3c1b54d0-fcfb-451b-ae3d-b731d3f9f6de-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-jhncx\" (UID: \"3c1b54d0-fcfb-451b-ae3d-b731d3f9f6de\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jhncx"
Mar 20 11:12:12 crc kubenswrapper[4860]: I0320 11:12:12.207210 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ee4e9c2-66c1-4431-bde4-29d09a044a32-metrics-certs\") pod \"speaker-brjk7\" (UID: \"6ee4e9c2-66c1-4431-bde4-29d09a044a32\") " pod="metallb-system/speaker-brjk7"
Mar 20 11:12:12 crc kubenswrapper[4860]: E0320 11:12:12.208328 4860 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Mar 20 11:12:12 crc kubenswrapper[4860]: E0320 11:12:12.208430 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ee4e9c2-66c1-4431-bde4-29d09a044a32-memberlist podName:6ee4e9c2-66c1-4431-bde4-29d09a044a32 nodeName:}" failed. No retries permitted until 2026-03-20 11:12:13.208406772 +0000 UTC m=+1057.429767840 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/6ee4e9c2-66c1-4431-bde4-29d09a044a32-memberlist") pod "speaker-brjk7" (UID: "6ee4e9c2-66c1-4431-bde4-29d09a044a32") : secret "metallb-memberlist" not found
Mar 20 11:12:12 crc kubenswrapper[4860]: I0320 11:12:12.218040 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ee4e9c2-66c1-4431-bde4-29d09a044a32-metrics-certs\") pod \"speaker-brjk7\" (UID: \"6ee4e9c2-66c1-4431-bde4-29d09a044a32\") " pod="metallb-system/speaker-brjk7"
Mar 20 11:12:12 crc kubenswrapper[4860]: I0320 11:12:12.219126 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3c1b54d0-fcfb-451b-ae3d-b731d3f9f6de-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-jhncx\" (UID: \"3c1b54d0-fcfb-451b-ae3d-b731d3f9f6de\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jhncx"
Mar 20 11:12:12 crc kubenswrapper[4860]: I0320 11:12:12.277490 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-tdcbt" event={"ID":"a2a3b82e-416b-4757-8719-97c58493428e","Type":"ContainerStarted","Data":"77826e25b7df8d41ed875fa4900194c9eb97fab2d3708c6b9b4ab8ffb23d5194"}
Mar 20 11:12:12 crc kubenswrapper[4860]: I0320 11:12:12.396190 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-mzzpz"
Mar 20 11:12:12 crc kubenswrapper[4860]: I0320 11:12:12.417026 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jhncx"
Mar 20 11:12:12 crc kubenswrapper[4860]: I0320 11:12:12.664168 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-jhncx"]
Mar 20 11:12:12 crc kubenswrapper[4860]: W0320 11:12:12.676512 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c1b54d0_fcfb_451b_ae3d_b731d3f9f6de.slice/crio-1d54cd7a8a80089e09ea62871356442a1db2f62947d67ce9bdcbbacb402b4458 WatchSource:0}: Error finding container 1d54cd7a8a80089e09ea62871356442a1db2f62947d67ce9bdcbbacb402b4458: Status 404 returned error can't find the container with id 1d54cd7a8a80089e09ea62871356442a1db2f62947d67ce9bdcbbacb402b4458
Mar 20 11:12:13 crc kubenswrapper[4860]: I0320 11:12:13.231214 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6ee4e9c2-66c1-4431-bde4-29d09a044a32-memberlist\") pod \"speaker-brjk7\" (UID: \"6ee4e9c2-66c1-4431-bde4-29d09a044a32\") " pod="metallb-system/speaker-brjk7"
Mar 20 11:12:13 crc kubenswrapper[4860]: I0320 11:12:13.239048 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6ee4e9c2-66c1-4431-bde4-29d09a044a32-memberlist\") pod \"speaker-brjk7\" (UID: \"6ee4e9c2-66c1-4431-bde4-29d09a044a32\") " pod="metallb-system/speaker-brjk7"
Mar 20 11:12:13 crc kubenswrapper[4860]: I0320 11:12:13.285650 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mzzpz" event={"ID":"dcd69c7f-fded-4b09-bd44-607b27716196","Type":"ContainerStarted","Data":"9a5a2a81788b3204aab5ede788399eaab2ede7ab52823dbaa8a0fe1d9a7e8f70"}
Mar 20 11:12:13 crc kubenswrapper[4860]: I0320 11:12:13.286723 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jhncx" event={"ID":"3c1b54d0-fcfb-451b-ae3d-b731d3f9f6de","Type":"ContainerStarted","Data":"1d54cd7a8a80089e09ea62871356442a1db2f62947d67ce9bdcbbacb402b4458"}
Mar 20 11:12:13 crc kubenswrapper[4860]: I0320 11:12:13.289456 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-tdcbt" event={"ID":"a2a3b82e-416b-4757-8719-97c58493428e","Type":"ContainerStarted","Data":"37874dc77ffe970aa159f172c1f5236af3dd2e530af7ff74f729ce2d30ceae32"}
Mar 20 11:12:13 crc kubenswrapper[4860]: I0320 11:12:13.289519 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-tdcbt" event={"ID":"a2a3b82e-416b-4757-8719-97c58493428e","Type":"ContainerStarted","Data":"9069102d7168139a81f69619e9ce7652a29e25d97c2adad87a3381640d2a24e3"}
Mar 20 11:12:13 crc kubenswrapper[4860]: I0320 11:12:13.289862 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-tdcbt"
Mar 20 11:12:13 crc kubenswrapper[4860]: I0320 11:12:13.313826 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-tdcbt" podStartSLOduration=2.313802686 podStartE2EDuration="2.313802686s" podCreationTimestamp="2026-03-20 11:12:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 11:12:13.307827405 +0000 UTC m=+1057.529188303" watchObservedRunningTime="2026-03-20 11:12:13.313802686 +0000 UTC m=+1057.535163584"
Mar 20 11:12:13 crc kubenswrapper[4860]: I0320 11:12:13.403459 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-brjk7"
Mar 20 11:12:13 crc kubenswrapper[4860]: W0320 11:12:13.432424 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ee4e9c2_66c1_4431_bde4_29d09a044a32.slice/crio-11ee8642eb2d5dfb2a282c972c8f9b64e10cb5afad30ac7cb08a84d1a8d79288 WatchSource:0}: Error finding container 11ee8642eb2d5dfb2a282c972c8f9b64e10cb5afad30ac7cb08a84d1a8d79288: Status 404 returned error can't find the container with id 11ee8642eb2d5dfb2a282c972c8f9b64e10cb5afad30ac7cb08a84d1a8d79288
Mar 20 11:12:14 crc kubenswrapper[4860]: I0320 11:12:14.307760 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-brjk7" event={"ID":"6ee4e9c2-66c1-4431-bde4-29d09a044a32","Type":"ContainerStarted","Data":"dba30ebe5d612295a23d92d89cb16de60bbd5d2659c7cf3b10afd3a35d49915e"}
Mar 20 11:12:14 crc kubenswrapper[4860]: I0320 11:12:14.308395 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-brjk7" event={"ID":"6ee4e9c2-66c1-4431-bde4-29d09a044a32","Type":"ContainerStarted","Data":"90cd72b4fe99d365a38733a8e261e3e8ea8f26af36dbf447b600e30f50309ec3"}
Mar 20 11:12:14 crc kubenswrapper[4860]: I0320 11:12:14.308436 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-brjk7" event={"ID":"6ee4e9c2-66c1-4431-bde4-29d09a044a32","Type":"ContainerStarted","Data":"11ee8642eb2d5dfb2a282c972c8f9b64e10cb5afad30ac7cb08a84d1a8d79288"}
Mar 20 11:12:14 crc kubenswrapper[4860]: I0320 11:12:14.308671 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-brjk7"
Mar 20 11:12:14 crc kubenswrapper[4860]: I0320 11:12:14.335792 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-brjk7" podStartSLOduration=3.335759072 podStartE2EDuration="3.335759072s" podCreationTimestamp="2026-03-20 11:12:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 11:12:14.32973444 +0000 UTC m=+1058.551095338" watchObservedRunningTime="2026-03-20 11:12:14.335759072 +0000 UTC m=+1058.557119970"
Mar 20 11:12:23 crc kubenswrapper[4860]: I0320 11:12:23.434197 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-brjk7"
Mar 20 11:12:24 crc kubenswrapper[4860]: I0320 11:12:24.629363 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jhncx" event={"ID":"3c1b54d0-fcfb-451b-ae3d-b731d3f9f6de","Type":"ContainerStarted","Data":"eed5c36b42d9cf47b25f131ff575301fdec9e430a45f5f8e32ec4a722ec37b75"}
Mar 20 11:12:24 crc kubenswrapper[4860]: I0320 11:12:24.630830 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jhncx"
Mar 20 11:12:24 crc kubenswrapper[4860]: I0320 11:12:24.632880 4860 generic.go:334] "Generic (PLEG): container finished" podID="dcd69c7f-fded-4b09-bd44-607b27716196" containerID="4dfb9e8ea6a38a44d2ab27fd0047c19010d14d1dae6cef2b3ad3a660a1b7d2c8" exitCode=0
Mar 20 11:12:24 crc kubenswrapper[4860]: I0320 11:12:24.632910 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mzzpz" event={"ID":"dcd69c7f-fded-4b09-bd44-607b27716196","Type":"ContainerDied","Data":"4dfb9e8ea6a38a44d2ab27fd0047c19010d14d1dae6cef2b3ad3a660a1b7d2c8"}
Mar 20 11:12:24 crc kubenswrapper[4860]: I0320 11:12:24.654989 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jhncx" podStartSLOduration=3.028546661 podStartE2EDuration="13.654962904s" podCreationTimestamp="2026-03-20 11:12:11 +0000 UTC" firstStartedPulling="2026-03-20 11:12:12.679175801 +0000 UTC m=+1056.900536699" lastFinishedPulling="2026-03-20 11:12:23.305592044 +0000 UTC m=+1067.526952942" observedRunningTime="2026-03-20 11:12:24.651403628 +0000 UTC m=+1068.872764526" watchObservedRunningTime="2026-03-20 11:12:24.654962904 +0000 UTC m=+1068.876323832"
Mar 20 11:12:24 crc kubenswrapper[4860]: I0320 11:12:24.858481 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5km9m2"]
Mar 20 11:12:24 crc kubenswrapper[4860]: I0320 11:12:24.859984 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5km9m2"
Mar 20 11:12:24 crc kubenswrapper[4860]: I0320 11:12:24.863885 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Mar 20 11:12:24 crc kubenswrapper[4860]: I0320 11:12:24.870336 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5km9m2"]
Mar 20 11:12:24 crc kubenswrapper[4860]: I0320 11:12:24.980586 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ea4498dd-681d-4260-b895-06e53dbcc9b9-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5km9m2\" (UID: \"ea4498dd-681d-4260-b895-06e53dbcc9b9\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5km9m2"
Mar 20 11:12:24 crc kubenswrapper[4860]: I0320 11:12:24.980667 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ea4498dd-681d-4260-b895-06e53dbcc9b9-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5km9m2\" (UID: \"ea4498dd-681d-4260-b895-06e53dbcc9b9\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5km9m2"
Mar 20 11:12:24 crc kubenswrapper[4860]: I0320
11:12:24.980720 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfjkl\" (UniqueName: \"kubernetes.io/projected/ea4498dd-681d-4260-b895-06e53dbcc9b9-kube-api-access-rfjkl\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5km9m2\" (UID: \"ea4498dd-681d-4260-b895-06e53dbcc9b9\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5km9m2" Mar 20 11:12:25 crc kubenswrapper[4860]: I0320 11:12:25.082486 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ea4498dd-681d-4260-b895-06e53dbcc9b9-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5km9m2\" (UID: \"ea4498dd-681d-4260-b895-06e53dbcc9b9\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5km9m2" Mar 20 11:12:25 crc kubenswrapper[4860]: I0320 11:12:25.082570 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfjkl\" (UniqueName: \"kubernetes.io/projected/ea4498dd-681d-4260-b895-06e53dbcc9b9-kube-api-access-rfjkl\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5km9m2\" (UID: \"ea4498dd-681d-4260-b895-06e53dbcc9b9\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5km9m2" Mar 20 11:12:25 crc kubenswrapper[4860]: I0320 11:12:25.082676 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ea4498dd-681d-4260-b895-06e53dbcc9b9-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5km9m2\" (UID: \"ea4498dd-681d-4260-b895-06e53dbcc9b9\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5km9m2" Mar 20 11:12:25 crc kubenswrapper[4860]: I0320 11:12:25.083139 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" 
(UniqueName: \"kubernetes.io/empty-dir/ea4498dd-681d-4260-b895-06e53dbcc9b9-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5km9m2\" (UID: \"ea4498dd-681d-4260-b895-06e53dbcc9b9\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5km9m2" Mar 20 11:12:25 crc kubenswrapper[4860]: I0320 11:12:25.083179 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ea4498dd-681d-4260-b895-06e53dbcc9b9-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5km9m2\" (UID: \"ea4498dd-681d-4260-b895-06e53dbcc9b9\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5km9m2" Mar 20 11:12:25 crc kubenswrapper[4860]: I0320 11:12:25.104139 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfjkl\" (UniqueName: \"kubernetes.io/projected/ea4498dd-681d-4260-b895-06e53dbcc9b9-kube-api-access-rfjkl\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5km9m2\" (UID: \"ea4498dd-681d-4260-b895-06e53dbcc9b9\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5km9m2" Mar 20 11:12:25 crc kubenswrapper[4860]: I0320 11:12:25.176068 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5km9m2" Mar 20 11:12:25 crc kubenswrapper[4860]: I0320 11:12:25.425534 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5km9m2"] Mar 20 11:12:25 crc kubenswrapper[4860]: W0320 11:12:25.425712 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea4498dd_681d_4260_b895_06e53dbcc9b9.slice/crio-5cccc6d4f32f03d93feac9521857e367dfd46c76a0c04d5f4c410c0df53a068f WatchSource:0}: Error finding container 5cccc6d4f32f03d93feac9521857e367dfd46c76a0c04d5f4c410c0df53a068f: Status 404 returned error can't find the container with id 5cccc6d4f32f03d93feac9521857e367dfd46c76a0c04d5f4c410c0df53a068f Mar 20 11:12:25 crc kubenswrapper[4860]: I0320 11:12:25.643394 4860 generic.go:334] "Generic (PLEG): container finished" podID="dcd69c7f-fded-4b09-bd44-607b27716196" containerID="aaefc127b3b814fda46a4fac6c09a1fa82786e8924d06b62a458ba7f97d8564f" exitCode=0 Mar 20 11:12:25 crc kubenswrapper[4860]: I0320 11:12:25.643671 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mzzpz" event={"ID":"dcd69c7f-fded-4b09-bd44-607b27716196","Type":"ContainerDied","Data":"aaefc127b3b814fda46a4fac6c09a1fa82786e8924d06b62a458ba7f97d8564f"} Mar 20 11:12:25 crc kubenswrapper[4860]: I0320 11:12:25.655338 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5km9m2" event={"ID":"ea4498dd-681d-4260-b895-06e53dbcc9b9","Type":"ContainerStarted","Data":"9681f51161892af9e576f0f4da65bd4a9a88256f64c337c54af28c08b7ea3d52"} Mar 20 11:12:25 crc kubenswrapper[4860]: I0320 11:12:25.655780 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5km9m2" 
event={"ID":"ea4498dd-681d-4260-b895-06e53dbcc9b9","Type":"ContainerStarted","Data":"5cccc6d4f32f03d93feac9521857e367dfd46c76a0c04d5f4c410c0df53a068f"} Mar 20 11:12:26 crc kubenswrapper[4860]: I0320 11:12:26.664380 4860 generic.go:334] "Generic (PLEG): container finished" podID="dcd69c7f-fded-4b09-bd44-607b27716196" containerID="910adb48feabecb49b3a0d84e41546776436b3ec06da2344fee430676505c2cc" exitCode=0 Mar 20 11:12:26 crc kubenswrapper[4860]: I0320 11:12:26.664498 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mzzpz" event={"ID":"dcd69c7f-fded-4b09-bd44-607b27716196","Type":"ContainerDied","Data":"910adb48feabecb49b3a0d84e41546776436b3ec06da2344fee430676505c2cc"} Mar 20 11:12:26 crc kubenswrapper[4860]: I0320 11:12:26.666246 4860 generic.go:334] "Generic (PLEG): container finished" podID="ea4498dd-681d-4260-b895-06e53dbcc9b9" containerID="9681f51161892af9e576f0f4da65bd4a9a88256f64c337c54af28c08b7ea3d52" exitCode=0 Mar 20 11:12:26 crc kubenswrapper[4860]: I0320 11:12:26.666358 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5km9m2" event={"ID":"ea4498dd-681d-4260-b895-06e53dbcc9b9","Type":"ContainerDied","Data":"9681f51161892af9e576f0f4da65bd4a9a88256f64c337c54af28c08b7ea3d52"} Mar 20 11:12:27 crc kubenswrapper[4860]: I0320 11:12:27.678590 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mzzpz" event={"ID":"dcd69c7f-fded-4b09-bd44-607b27716196","Type":"ContainerStarted","Data":"6a17b8ad777c84daa1bb9c2527a79acc31ee9625f9a1ca1c55538d2aed2c2f03"} Mar 20 11:12:27 crc kubenswrapper[4860]: I0320 11:12:27.679567 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mzzpz" event={"ID":"dcd69c7f-fded-4b09-bd44-607b27716196","Type":"ContainerStarted","Data":"283879f78be2c3c05ef7653b32c23325593b55e264572056cea75e01c82175cd"} Mar 20 11:12:27 crc kubenswrapper[4860]: I0320 11:12:27.679584 
4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mzzpz" event={"ID":"dcd69c7f-fded-4b09-bd44-607b27716196","Type":"ContainerStarted","Data":"d953d2007587ca380c18421211fad6db96e5a8196beab17433802aa2491647a6"} Mar 20 11:12:27 crc kubenswrapper[4860]: I0320 11:12:27.679597 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mzzpz" event={"ID":"dcd69c7f-fded-4b09-bd44-607b27716196","Type":"ContainerStarted","Data":"e31ef6f22e13c0c51b83b1537fdd286af21e3e29728a205a071567c260c7496a"} Mar 20 11:12:27 crc kubenswrapper[4860]: I0320 11:12:27.679614 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mzzpz" event={"ID":"dcd69c7f-fded-4b09-bd44-607b27716196","Type":"ContainerStarted","Data":"5f2bfe9d7a2cd3d2a5943cb7ea54c59e59831333e9b9dab3ffc97cd2b6e20e93"} Mar 20 11:12:28 crc kubenswrapper[4860]: I0320 11:12:28.692664 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mzzpz" event={"ID":"dcd69c7f-fded-4b09-bd44-607b27716196","Type":"ContainerStarted","Data":"e9dbd54f243509b30ec00213e9b5984ca62a3745f44b34c82fc394f17dbcae44"} Mar 20 11:12:28 crc kubenswrapper[4860]: I0320 11:12:28.692910 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-mzzpz" Mar 20 11:12:28 crc kubenswrapper[4860]: I0320 11:12:28.717868 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-mzzpz" podStartSLOduration=6.884554816 podStartE2EDuration="17.717844665s" podCreationTimestamp="2026-03-20 11:12:11 +0000 UTC" firstStartedPulling="2026-03-20 11:12:12.490946678 +0000 UTC m=+1056.712307576" lastFinishedPulling="2026-03-20 11:12:23.324236527 +0000 UTC m=+1067.545597425" observedRunningTime="2026-03-20 11:12:28.716668063 +0000 UTC m=+1072.938028961" watchObservedRunningTime="2026-03-20 11:12:28.717844665 +0000 UTC m=+1072.939205563" Mar 20 11:12:31 crc kubenswrapper[4860]: I0320 
11:12:31.918329 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-tdcbt" Mar 20 11:12:32 crc kubenswrapper[4860]: I0320 11:12:32.397555 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-mzzpz" Mar 20 11:12:32 crc kubenswrapper[4860]: I0320 11:12:32.440646 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-mzzpz" Mar 20 11:12:32 crc kubenswrapper[4860]: I0320 11:12:32.753669 4860 generic.go:334] "Generic (PLEG): container finished" podID="ea4498dd-681d-4260-b895-06e53dbcc9b9" containerID="2716a4ea9ee047cc78eaeeb30ebecda79f9eb56973079b8f9647e018e103efe1" exitCode=0 Mar 20 11:12:32 crc kubenswrapper[4860]: I0320 11:12:32.753746 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5km9m2" event={"ID":"ea4498dd-681d-4260-b895-06e53dbcc9b9","Type":"ContainerDied","Data":"2716a4ea9ee047cc78eaeeb30ebecda79f9eb56973079b8f9647e018e103efe1"} Mar 20 11:12:33 crc kubenswrapper[4860]: I0320 11:12:33.765820 4860 generic.go:334] "Generic (PLEG): container finished" podID="ea4498dd-681d-4260-b895-06e53dbcc9b9" containerID="db333eaee23207b52536e059c0067f4d72cc446172e50bde67fa1437447b5635" exitCode=0 Mar 20 11:12:33 crc kubenswrapper[4860]: I0320 11:12:33.765902 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5km9m2" event={"ID":"ea4498dd-681d-4260-b895-06e53dbcc9b9","Type":"ContainerDied","Data":"db333eaee23207b52536e059c0067f4d72cc446172e50bde67fa1437447b5635"} Mar 20 11:12:35 crc kubenswrapper[4860]: I0320 11:12:35.026697 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5km9m2" Mar 20 11:12:35 crc kubenswrapper[4860]: I0320 11:12:35.202090 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ea4498dd-681d-4260-b895-06e53dbcc9b9-util\") pod \"ea4498dd-681d-4260-b895-06e53dbcc9b9\" (UID: \"ea4498dd-681d-4260-b895-06e53dbcc9b9\") " Mar 20 11:12:35 crc kubenswrapper[4860]: I0320 11:12:35.202318 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfjkl\" (UniqueName: \"kubernetes.io/projected/ea4498dd-681d-4260-b895-06e53dbcc9b9-kube-api-access-rfjkl\") pod \"ea4498dd-681d-4260-b895-06e53dbcc9b9\" (UID: \"ea4498dd-681d-4260-b895-06e53dbcc9b9\") " Mar 20 11:12:35 crc kubenswrapper[4860]: I0320 11:12:35.202391 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ea4498dd-681d-4260-b895-06e53dbcc9b9-bundle\") pod \"ea4498dd-681d-4260-b895-06e53dbcc9b9\" (UID: \"ea4498dd-681d-4260-b895-06e53dbcc9b9\") " Mar 20 11:12:35 crc kubenswrapper[4860]: I0320 11:12:35.206212 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea4498dd-681d-4260-b895-06e53dbcc9b9-bundle" (OuterVolumeSpecName: "bundle") pod "ea4498dd-681d-4260-b895-06e53dbcc9b9" (UID: "ea4498dd-681d-4260-b895-06e53dbcc9b9"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:12:35 crc kubenswrapper[4860]: I0320 11:12:35.222537 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea4498dd-681d-4260-b895-06e53dbcc9b9-util" (OuterVolumeSpecName: "util") pod "ea4498dd-681d-4260-b895-06e53dbcc9b9" (UID: "ea4498dd-681d-4260-b895-06e53dbcc9b9"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:12:35 crc kubenswrapper[4860]: I0320 11:12:35.223219 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea4498dd-681d-4260-b895-06e53dbcc9b9-kube-api-access-rfjkl" (OuterVolumeSpecName: "kube-api-access-rfjkl") pod "ea4498dd-681d-4260-b895-06e53dbcc9b9" (UID: "ea4498dd-681d-4260-b895-06e53dbcc9b9"). InnerVolumeSpecName "kube-api-access-rfjkl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:12:35 crc kubenswrapper[4860]: I0320 11:12:35.305340 4860 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ea4498dd-681d-4260-b895-06e53dbcc9b9-util\") on node \"crc\" DevicePath \"\"" Mar 20 11:12:35 crc kubenswrapper[4860]: I0320 11:12:35.305425 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rfjkl\" (UniqueName: \"kubernetes.io/projected/ea4498dd-681d-4260-b895-06e53dbcc9b9-kube-api-access-rfjkl\") on node \"crc\" DevicePath \"\"" Mar 20 11:12:35 crc kubenswrapper[4860]: I0320 11:12:35.305439 4860 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ea4498dd-681d-4260-b895-06e53dbcc9b9-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 11:12:35 crc kubenswrapper[4860]: I0320 11:12:35.782608 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5km9m2" event={"ID":"ea4498dd-681d-4260-b895-06e53dbcc9b9","Type":"ContainerDied","Data":"5cccc6d4f32f03d93feac9521857e367dfd46c76a0c04d5f4c410c0df53a068f"} Mar 20 11:12:35 crc kubenswrapper[4860]: I0320 11:12:35.782661 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5cccc6d4f32f03d93feac9521857e367dfd46c76a0c04d5f4c410c0df53a068f" Mar 20 11:12:35 crc kubenswrapper[4860]: I0320 11:12:35.782712 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5km9m2" Mar 20 11:12:41 crc kubenswrapper[4860]: I0320 11:12:41.419051 4860 scope.go:117] "RemoveContainer" containerID="6b40403be918a788bbcc242393eb71ec98682fddffb9062133713238970f5b03" Mar 20 11:12:42 crc kubenswrapper[4860]: I0320 11:12:42.408254 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-mzzpz" Mar 20 11:12:42 crc kubenswrapper[4860]: I0320 11:12:42.456183 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jhncx" Mar 20 11:12:43 crc kubenswrapper[4860]: I0320 11:12:43.094195 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-h84nz"] Mar 20 11:12:43 crc kubenswrapper[4860]: E0320 11:12:43.094531 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea4498dd-681d-4260-b895-06e53dbcc9b9" containerName="extract" Mar 20 11:12:43 crc kubenswrapper[4860]: I0320 11:12:43.094550 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea4498dd-681d-4260-b895-06e53dbcc9b9" containerName="extract" Mar 20 11:12:43 crc kubenswrapper[4860]: E0320 11:12:43.094575 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea4498dd-681d-4260-b895-06e53dbcc9b9" containerName="pull" Mar 20 11:12:43 crc kubenswrapper[4860]: I0320 11:12:43.094584 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea4498dd-681d-4260-b895-06e53dbcc9b9" containerName="pull" Mar 20 11:12:43 crc kubenswrapper[4860]: E0320 11:12:43.094593 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea4498dd-681d-4260-b895-06e53dbcc9b9" containerName="util" Mar 20 11:12:43 crc kubenswrapper[4860]: I0320 11:12:43.094601 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea4498dd-681d-4260-b895-06e53dbcc9b9" containerName="util" 
Mar 20 11:12:43 crc kubenswrapper[4860]: I0320 11:12:43.094714 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea4498dd-681d-4260-b895-06e53dbcc9b9" containerName="extract" Mar 20 11:12:43 crc kubenswrapper[4860]: I0320 11:12:43.095265 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-h84nz" Mar 20 11:12:43 crc kubenswrapper[4860]: I0320 11:12:43.098146 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Mar 20 11:12:43 crc kubenswrapper[4860]: I0320 11:12:43.098280 4860 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-7rn4v" Mar 20 11:12:43 crc kubenswrapper[4860]: I0320 11:12:43.098403 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Mar 20 11:12:43 crc kubenswrapper[4860]: I0320 11:12:43.113583 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-h84nz"] Mar 20 11:12:43 crc kubenswrapper[4860]: I0320 11:12:43.222577 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/630d2077-3457-4dcf-b9ab-82f77e819c54-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-h84nz\" (UID: \"630d2077-3457-4dcf-b9ab-82f77e819c54\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-h84nz" Mar 20 11:12:43 crc kubenswrapper[4860]: I0320 11:12:43.222703 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fz7w\" (UniqueName: \"kubernetes.io/projected/630d2077-3457-4dcf-b9ab-82f77e819c54-kube-api-access-2fz7w\") pod 
\"cert-manager-operator-controller-manager-66c8bdd694-h84nz\" (UID: \"630d2077-3457-4dcf-b9ab-82f77e819c54\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-h84nz" Mar 20 11:12:43 crc kubenswrapper[4860]: I0320 11:12:43.323797 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/630d2077-3457-4dcf-b9ab-82f77e819c54-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-h84nz\" (UID: \"630d2077-3457-4dcf-b9ab-82f77e819c54\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-h84nz" Mar 20 11:12:43 crc kubenswrapper[4860]: I0320 11:12:43.323873 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fz7w\" (UniqueName: \"kubernetes.io/projected/630d2077-3457-4dcf-b9ab-82f77e819c54-kube-api-access-2fz7w\") pod \"cert-manager-operator-controller-manager-66c8bdd694-h84nz\" (UID: \"630d2077-3457-4dcf-b9ab-82f77e819c54\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-h84nz" Mar 20 11:12:43 crc kubenswrapper[4860]: I0320 11:12:43.324466 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/630d2077-3457-4dcf-b9ab-82f77e819c54-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-h84nz\" (UID: \"630d2077-3457-4dcf-b9ab-82f77e819c54\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-h84nz" Mar 20 11:12:43 crc kubenswrapper[4860]: I0320 11:12:43.346929 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fz7w\" (UniqueName: \"kubernetes.io/projected/630d2077-3457-4dcf-b9ab-82f77e819c54-kube-api-access-2fz7w\") pod \"cert-manager-operator-controller-manager-66c8bdd694-h84nz\" (UID: \"630d2077-3457-4dcf-b9ab-82f77e819c54\") " 
pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-h84nz" Mar 20 11:12:43 crc kubenswrapper[4860]: I0320 11:12:43.413057 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-h84nz" Mar 20 11:12:43 crc kubenswrapper[4860]: I0320 11:12:43.733589 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-h84nz"] Mar 20 11:12:43 crc kubenswrapper[4860]: I0320 11:12:43.850653 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-h84nz" event={"ID":"630d2077-3457-4dcf-b9ab-82f77e819c54","Type":"ContainerStarted","Data":"18d35c19543363cefab52b399f58cdc08f4d5374500a57ffc30c5c217835a5c5"} Mar 20 11:12:49 crc kubenswrapper[4860]: I0320 11:12:49.898732 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-h84nz" event={"ID":"630d2077-3457-4dcf-b9ab-82f77e819c54","Type":"ContainerStarted","Data":"1eb7ab2639b189b7ac10546de5366184b490d9738bbeb42179304524516fad21"} Mar 20 11:12:49 crc kubenswrapper[4860]: I0320 11:12:49.923075 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-h84nz" podStartSLOduration=0.994380748 podStartE2EDuration="6.923056279s" podCreationTimestamp="2026-03-20 11:12:43 +0000 UTC" firstStartedPulling="2026-03-20 11:12:43.748578863 +0000 UTC m=+1087.969939761" lastFinishedPulling="2026-03-20 11:12:49.677254394 +0000 UTC m=+1093.898615292" observedRunningTime="2026-03-20 11:12:49.916859732 +0000 UTC m=+1094.138220630" watchObservedRunningTime="2026-03-20 11:12:49.923056279 +0000 UTC m=+1094.144417177" Mar 20 11:12:52 crc kubenswrapper[4860]: I0320 11:12:52.689199 4860 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-skhrl"] Mar 20 11:12:52 crc kubenswrapper[4860]: I0320 11:12:52.691149 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-skhrl" Mar 20 11:12:52 crc kubenswrapper[4860]: I0320 11:12:52.698373 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-skhrl"] Mar 20 11:12:52 crc kubenswrapper[4860]: I0320 11:12:52.699299 4860 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-t9rhk" Mar 20 11:12:52 crc kubenswrapper[4860]: I0320 11:12:52.699704 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 20 11:12:52 crc kubenswrapper[4860]: I0320 11:12:52.699968 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 20 11:12:52 crc kubenswrapper[4860]: I0320 11:12:52.930198 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4587c778-c12c-48e0-8c28-7eb7a7c1b722-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-skhrl\" (UID: \"4587c778-c12c-48e0-8c28-7eb7a7c1b722\") " pod="cert-manager/cert-manager-webhook-6888856db4-skhrl" Mar 20 11:12:52 crc kubenswrapper[4860]: I0320 11:12:52.930335 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwgwb\" (UniqueName: \"kubernetes.io/projected/4587c778-c12c-48e0-8c28-7eb7a7c1b722-kube-api-access-rwgwb\") pod \"cert-manager-webhook-6888856db4-skhrl\" (UID: \"4587c778-c12c-48e0-8c28-7eb7a7c1b722\") " pod="cert-manager/cert-manager-webhook-6888856db4-skhrl" Mar 20 11:12:53 crc kubenswrapper[4860]: I0320 11:12:53.031940 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4587c778-c12c-48e0-8c28-7eb7a7c1b722-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-skhrl\" (UID: \"4587c778-c12c-48e0-8c28-7eb7a7c1b722\") " pod="cert-manager/cert-manager-webhook-6888856db4-skhrl" Mar 20 11:12:53 crc kubenswrapper[4860]: I0320 11:12:53.032007 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwgwb\" (UniqueName: \"kubernetes.io/projected/4587c778-c12c-48e0-8c28-7eb7a7c1b722-kube-api-access-rwgwb\") pod \"cert-manager-webhook-6888856db4-skhrl\" (UID: \"4587c778-c12c-48e0-8c28-7eb7a7c1b722\") " pod="cert-manager/cert-manager-webhook-6888856db4-skhrl" Mar 20 11:12:53 crc kubenswrapper[4860]: I0320 11:12:53.055790 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4587c778-c12c-48e0-8c28-7eb7a7c1b722-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-skhrl\" (UID: \"4587c778-c12c-48e0-8c28-7eb7a7c1b722\") " pod="cert-manager/cert-manager-webhook-6888856db4-skhrl" Mar 20 11:12:53 crc kubenswrapper[4860]: I0320 11:12:53.056121 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwgwb\" (UniqueName: \"kubernetes.io/projected/4587c778-c12c-48e0-8c28-7eb7a7c1b722-kube-api-access-rwgwb\") pod \"cert-manager-webhook-6888856db4-skhrl\" (UID: \"4587c778-c12c-48e0-8c28-7eb7a7c1b722\") " pod="cert-manager/cert-manager-webhook-6888856db4-skhrl" Mar 20 11:12:53 crc kubenswrapper[4860]: I0320 11:12:53.070839 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-skhrl" Mar 20 11:12:53 crc kubenswrapper[4860]: I0320 11:12:53.814912 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-skhrl"] Mar 20 11:12:53 crc kubenswrapper[4860]: I0320 11:12:53.949833 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-skhrl" event={"ID":"4587c778-c12c-48e0-8c28-7eb7a7c1b722","Type":"ContainerStarted","Data":"8947d1ea38c8e3caa32ce4b6e72203473c07b19f5cb48d1e6d89ceeff94d1432"} Mar 20 11:12:57 crc kubenswrapper[4860]: I0320 11:12:57.095060 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-sjz7s"] Mar 20 11:12:57 crc kubenswrapper[4860]: I0320 11:12:57.097136 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-sjz7s" Mar 20 11:12:57 crc kubenswrapper[4860]: I0320 11:12:57.099205 4860 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-96spt" Mar 20 11:12:57 crc kubenswrapper[4860]: I0320 11:12:57.111017 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-sjz7s"] Mar 20 11:12:57 crc kubenswrapper[4860]: I0320 11:12:57.185533 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chxtm\" (UniqueName: \"kubernetes.io/projected/aa7d8eaa-20ac-4ea3-b19d-e8f89054c619-kube-api-access-chxtm\") pod \"cert-manager-cainjector-5545bd876-sjz7s\" (UID: \"aa7d8eaa-20ac-4ea3-b19d-e8f89054c619\") " pod="cert-manager/cert-manager-cainjector-5545bd876-sjz7s" Mar 20 11:12:57 crc kubenswrapper[4860]: I0320 11:12:57.185700 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/aa7d8eaa-20ac-4ea3-b19d-e8f89054c619-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-sjz7s\" (UID: \"aa7d8eaa-20ac-4ea3-b19d-e8f89054c619\") " pod="cert-manager/cert-manager-cainjector-5545bd876-sjz7s" Mar 20 11:12:57 crc kubenswrapper[4860]: I0320 11:12:57.289296 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/aa7d8eaa-20ac-4ea3-b19d-e8f89054c619-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-sjz7s\" (UID: \"aa7d8eaa-20ac-4ea3-b19d-e8f89054c619\") " pod="cert-manager/cert-manager-cainjector-5545bd876-sjz7s" Mar 20 11:12:57 crc kubenswrapper[4860]: I0320 11:12:57.289423 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chxtm\" (UniqueName: \"kubernetes.io/projected/aa7d8eaa-20ac-4ea3-b19d-e8f89054c619-kube-api-access-chxtm\") pod \"cert-manager-cainjector-5545bd876-sjz7s\" (UID: \"aa7d8eaa-20ac-4ea3-b19d-e8f89054c619\") " pod="cert-manager/cert-manager-cainjector-5545bd876-sjz7s" Mar 20 11:12:57 crc kubenswrapper[4860]: I0320 11:12:57.312299 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/aa7d8eaa-20ac-4ea3-b19d-e8f89054c619-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-sjz7s\" (UID: \"aa7d8eaa-20ac-4ea3-b19d-e8f89054c619\") " pod="cert-manager/cert-manager-cainjector-5545bd876-sjz7s" Mar 20 11:12:57 crc kubenswrapper[4860]: I0320 11:12:57.316481 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chxtm\" (UniqueName: \"kubernetes.io/projected/aa7d8eaa-20ac-4ea3-b19d-e8f89054c619-kube-api-access-chxtm\") pod \"cert-manager-cainjector-5545bd876-sjz7s\" (UID: \"aa7d8eaa-20ac-4ea3-b19d-e8f89054c619\") " pod="cert-manager/cert-manager-cainjector-5545bd876-sjz7s" Mar 20 11:12:57 crc kubenswrapper[4860]: I0320 11:12:57.423645 4860 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-sjz7s" Mar 20 11:13:01 crc kubenswrapper[4860]: I0320 11:13:01.222741 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-sjz7s"] Mar 20 11:13:02 crc kubenswrapper[4860]: I0320 11:13:02.016953 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-skhrl" event={"ID":"4587c778-c12c-48e0-8c28-7eb7a7c1b722","Type":"ContainerStarted","Data":"1f5b7359ddb2c5466b2f230e8d4d31df30245fc54e64e469a4cc280ec6c1c5a0"} Mar 20 11:13:02 crc kubenswrapper[4860]: I0320 11:13:02.017577 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-6888856db4-skhrl" Mar 20 11:13:02 crc kubenswrapper[4860]: I0320 11:13:02.018676 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-sjz7s" event={"ID":"aa7d8eaa-20ac-4ea3-b19d-e8f89054c619","Type":"ContainerStarted","Data":"4a0264fae7cc9a8763d85efc390fbe6ebb3b72c07a1bb87615bb1ce1fa205e23"} Mar 20 11:13:02 crc kubenswrapper[4860]: I0320 11:13:02.018729 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-sjz7s" event={"ID":"aa7d8eaa-20ac-4ea3-b19d-e8f89054c619","Type":"ContainerStarted","Data":"c7661c1d441f146549021fcf912401cd216dbe9fb176214ef73ea97f9d148b5d"} Mar 20 11:13:02 crc kubenswrapper[4860]: I0320 11:13:02.041268 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-6888856db4-skhrl" podStartSLOduration=2.785166915 podStartE2EDuration="10.041243074s" podCreationTimestamp="2026-03-20 11:12:52 +0000 UTC" firstStartedPulling="2026-03-20 11:12:53.825494805 +0000 UTC m=+1098.046855703" lastFinishedPulling="2026-03-20 11:13:01.081570964 +0000 UTC m=+1105.302931862" observedRunningTime="2026-03-20 11:13:02.03556865 +0000 UTC 
m=+1106.256929548" watchObservedRunningTime="2026-03-20 11:13:02.041243074 +0000 UTC m=+1106.262603972" Mar 20 11:13:02 crc kubenswrapper[4860]: I0320 11:13:02.056396 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-5545bd876-sjz7s" podStartSLOduration=5.056375593 podStartE2EDuration="5.056375593s" podCreationTimestamp="2026-03-20 11:12:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 11:13:02.052829957 +0000 UTC m=+1106.274190855" watchObservedRunningTime="2026-03-20 11:13:02.056375593 +0000 UTC m=+1106.277736481" Mar 20 11:13:08 crc kubenswrapper[4860]: I0320 11:13:08.075349 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-6888856db4-skhrl" Mar 20 11:13:12 crc kubenswrapper[4860]: I0320 11:13:12.466670 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-545d4d4674-x6lwp"] Mar 20 11:13:12 crc kubenswrapper[4860]: I0320 11:13:12.468678 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-x6lwp" Mar 20 11:13:12 crc kubenswrapper[4860]: I0320 11:13:12.478040 4860 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-wq9fk" Mar 20 11:13:12 crc kubenswrapper[4860]: I0320 11:13:12.479576 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-x6lwp"] Mar 20 11:13:12 crc kubenswrapper[4860]: I0320 11:13:12.582532 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/74ed19c1-0e46-4fed-b50f-155eaa38aed9-bound-sa-token\") pod \"cert-manager-545d4d4674-x6lwp\" (UID: \"74ed19c1-0e46-4fed-b50f-155eaa38aed9\") " pod="cert-manager/cert-manager-545d4d4674-x6lwp" Mar 20 11:13:12 crc kubenswrapper[4860]: I0320 11:13:12.582613 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55wh9\" (UniqueName: \"kubernetes.io/projected/74ed19c1-0e46-4fed-b50f-155eaa38aed9-kube-api-access-55wh9\") pod \"cert-manager-545d4d4674-x6lwp\" (UID: \"74ed19c1-0e46-4fed-b50f-155eaa38aed9\") " pod="cert-manager/cert-manager-545d4d4674-x6lwp" Mar 20 11:13:12 crc kubenswrapper[4860]: I0320 11:13:12.684190 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55wh9\" (UniqueName: \"kubernetes.io/projected/74ed19c1-0e46-4fed-b50f-155eaa38aed9-kube-api-access-55wh9\") pod \"cert-manager-545d4d4674-x6lwp\" (UID: \"74ed19c1-0e46-4fed-b50f-155eaa38aed9\") " pod="cert-manager/cert-manager-545d4d4674-x6lwp" Mar 20 11:13:12 crc kubenswrapper[4860]: I0320 11:13:12.684261 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/74ed19c1-0e46-4fed-b50f-155eaa38aed9-bound-sa-token\") pod \"cert-manager-545d4d4674-x6lwp\" (UID: 
\"74ed19c1-0e46-4fed-b50f-155eaa38aed9\") " pod="cert-manager/cert-manager-545d4d4674-x6lwp" Mar 20 11:13:12 crc kubenswrapper[4860]: I0320 11:13:12.706488 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/74ed19c1-0e46-4fed-b50f-155eaa38aed9-bound-sa-token\") pod \"cert-manager-545d4d4674-x6lwp\" (UID: \"74ed19c1-0e46-4fed-b50f-155eaa38aed9\") " pod="cert-manager/cert-manager-545d4d4674-x6lwp" Mar 20 11:13:12 crc kubenswrapper[4860]: I0320 11:13:12.706735 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55wh9\" (UniqueName: \"kubernetes.io/projected/74ed19c1-0e46-4fed-b50f-155eaa38aed9-kube-api-access-55wh9\") pod \"cert-manager-545d4d4674-x6lwp\" (UID: \"74ed19c1-0e46-4fed-b50f-155eaa38aed9\") " pod="cert-manager/cert-manager-545d4d4674-x6lwp" Mar 20 11:13:12 crc kubenswrapper[4860]: I0320 11:13:12.802212 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-x6lwp" Mar 20 11:13:13 crc kubenswrapper[4860]: I0320 11:13:13.189243 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-x6lwp"] Mar 20 11:13:13 crc kubenswrapper[4860]: W0320 11:13:13.195907 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod74ed19c1_0e46_4fed_b50f_155eaa38aed9.slice/crio-f33aa1c0aedf3a2d58ce21e8fe745861fa7df5a3a9e98fcf5dfa1bac5ce024fa WatchSource:0}: Error finding container f33aa1c0aedf3a2d58ce21e8fe745861fa7df5a3a9e98fcf5dfa1bac5ce024fa: Status 404 returned error can't find the container with id f33aa1c0aedf3a2d58ce21e8fe745861fa7df5a3a9e98fcf5dfa1bac5ce024fa Mar 20 11:13:14 crc kubenswrapper[4860]: I0320 11:13:14.113255 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-x6lwp" 
event={"ID":"74ed19c1-0e46-4fed-b50f-155eaa38aed9","Type":"ContainerStarted","Data":"ce901f928ecd236de2e6dd1faf0bd6e2b197f0a6a58684993d4d5230170700fa"} Mar 20 11:13:14 crc kubenswrapper[4860]: I0320 11:13:14.113738 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-x6lwp" event={"ID":"74ed19c1-0e46-4fed-b50f-155eaa38aed9","Type":"ContainerStarted","Data":"f33aa1c0aedf3a2d58ce21e8fe745861fa7df5a3a9e98fcf5dfa1bac5ce024fa"} Mar 20 11:13:14 crc kubenswrapper[4860]: I0320 11:13:14.136702 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-545d4d4674-x6lwp" podStartSLOduration=2.136677417 podStartE2EDuration="2.136677417s" podCreationTimestamp="2026-03-20 11:13:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 11:13:14.132452753 +0000 UTC m=+1118.353813661" watchObservedRunningTime="2026-03-20 11:13:14.136677417 +0000 UTC m=+1118.358038325" Mar 20 11:13:21 crc kubenswrapper[4860]: I0320 11:13:21.178218 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-mhn62"] Mar 20 11:13:21 crc kubenswrapper[4860]: I0320 11:13:21.180169 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-mhn62" Mar 20 11:13:21 crc kubenswrapper[4860]: I0320 11:13:21.185617 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 20 11:13:21 crc kubenswrapper[4860]: I0320 11:13:21.185727 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-cz6kh" Mar 20 11:13:21 crc kubenswrapper[4860]: I0320 11:13:21.190395 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 20 11:13:21 crc kubenswrapper[4860]: I0320 11:13:21.203914 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-mhn62"] Mar 20 11:13:21 crc kubenswrapper[4860]: I0320 11:13:21.328250 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgg5j\" (UniqueName: \"kubernetes.io/projected/d2f28a99-89df-4152-a8d5-ddf3a8f3edcd-kube-api-access-xgg5j\") pod \"openstack-operator-index-mhn62\" (UID: \"d2f28a99-89df-4152-a8d5-ddf3a8f3edcd\") " pod="openstack-operators/openstack-operator-index-mhn62" Mar 20 11:13:21 crc kubenswrapper[4860]: I0320 11:13:21.429861 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgg5j\" (UniqueName: \"kubernetes.io/projected/d2f28a99-89df-4152-a8d5-ddf3a8f3edcd-kube-api-access-xgg5j\") pod \"openstack-operator-index-mhn62\" (UID: \"d2f28a99-89df-4152-a8d5-ddf3a8f3edcd\") " pod="openstack-operators/openstack-operator-index-mhn62" Mar 20 11:13:21 crc kubenswrapper[4860]: I0320 11:13:21.459458 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgg5j\" (UniqueName: \"kubernetes.io/projected/d2f28a99-89df-4152-a8d5-ddf3a8f3edcd-kube-api-access-xgg5j\") pod \"openstack-operator-index-mhn62\" (UID: 
\"d2f28a99-89df-4152-a8d5-ddf3a8f3edcd\") " pod="openstack-operators/openstack-operator-index-mhn62" Mar 20 11:13:21 crc kubenswrapper[4860]: I0320 11:13:21.502286 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-mhn62" Mar 20 11:13:21 crc kubenswrapper[4860]: I0320 11:13:21.988717 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-mhn62"] Mar 20 11:13:22 crc kubenswrapper[4860]: I0320 11:13:22.167216 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-mhn62" event={"ID":"d2f28a99-89df-4152-a8d5-ddf3a8f3edcd","Type":"ContainerStarted","Data":"e67e25cb07cccf98304aadd710a0423c4f122b21342a65a9b09ad9931a4c05cc"} Mar 20 11:13:24 crc kubenswrapper[4860]: I0320 11:13:24.555509 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-mhn62"] Mar 20 11:13:25 crc kubenswrapper[4860]: I0320 11:13:25.175538 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-82h6r"] Mar 20 11:13:25 crc kubenswrapper[4860]: I0320 11:13:25.176616 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-82h6r" Mar 20 11:13:25 crc kubenswrapper[4860]: I0320 11:13:25.184262 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-82h6r"] Mar 20 11:13:25 crc kubenswrapper[4860]: I0320 11:13:25.196732 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnx66\" (UniqueName: \"kubernetes.io/projected/f7193309-39f9-4487-b02b-8e9e4d6a69ff-kube-api-access-dnx66\") pod \"openstack-operator-index-82h6r\" (UID: \"f7193309-39f9-4487-b02b-8e9e4d6a69ff\") " pod="openstack-operators/openstack-operator-index-82h6r" Mar 20 11:13:25 crc kubenswrapper[4860]: I0320 11:13:25.297739 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnx66\" (UniqueName: \"kubernetes.io/projected/f7193309-39f9-4487-b02b-8e9e4d6a69ff-kube-api-access-dnx66\") pod \"openstack-operator-index-82h6r\" (UID: \"f7193309-39f9-4487-b02b-8e9e4d6a69ff\") " pod="openstack-operators/openstack-operator-index-82h6r" Mar 20 11:13:25 crc kubenswrapper[4860]: I0320 11:13:25.331047 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnx66\" (UniqueName: \"kubernetes.io/projected/f7193309-39f9-4487-b02b-8e9e4d6a69ff-kube-api-access-dnx66\") pod \"openstack-operator-index-82h6r\" (UID: \"f7193309-39f9-4487-b02b-8e9e4d6a69ff\") " pod="openstack-operators/openstack-operator-index-82h6r" Mar 20 11:13:25 crc kubenswrapper[4860]: I0320 11:13:25.508173 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-82h6r" Mar 20 11:13:26 crc kubenswrapper[4860]: I0320 11:13:26.222358 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-mhn62" event={"ID":"d2f28a99-89df-4152-a8d5-ddf3a8f3edcd","Type":"ContainerStarted","Data":"b8bee47d82b7f9f6349bce4e0d028f154ea992dd78bd7c57f6c33abe1825070d"} Mar 20 11:13:26 crc kubenswrapper[4860]: I0320 11:13:26.222589 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-mhn62" podUID="d2f28a99-89df-4152-a8d5-ddf3a8f3edcd" containerName="registry-server" containerID="cri-o://b8bee47d82b7f9f6349bce4e0d028f154ea992dd78bd7c57f6c33abe1825070d" gracePeriod=2 Mar 20 11:13:26 crc kubenswrapper[4860]: I0320 11:13:26.245193 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-mhn62" podStartSLOduration=1.279419508 podStartE2EDuration="5.245156614s" podCreationTimestamp="2026-03-20 11:13:21 +0000 UTC" firstStartedPulling="2026-03-20 11:13:21.999965046 +0000 UTC m=+1126.221325944" lastFinishedPulling="2026-03-20 11:13:25.965702152 +0000 UTC m=+1130.187063050" observedRunningTime="2026-03-20 11:13:26.24351919 +0000 UTC m=+1130.464880088" watchObservedRunningTime="2026-03-20 11:13:26.245156614 +0000 UTC m=+1130.466517512" Mar 20 11:13:26 crc kubenswrapper[4860]: I0320 11:13:26.271380 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-82h6r"] Mar 20 11:13:26 crc kubenswrapper[4860]: W0320 11:13:26.319871 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7193309_39f9_4487_b02b_8e9e4d6a69ff.slice/crio-356c8758287eaad6a8db73dea56a760cc73fae7d56a51c80cc4690e20327fafd WatchSource:0}: Error finding container 
356c8758287eaad6a8db73dea56a760cc73fae7d56a51c80cc4690e20327fafd: Status 404 returned error can't find the container with id 356c8758287eaad6a8db73dea56a760cc73fae7d56a51c80cc4690e20327fafd Mar 20 11:13:26 crc kubenswrapper[4860]: I0320 11:13:26.610824 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-mhn62_d2f28a99-89df-4152-a8d5-ddf3a8f3edcd/registry-server/0.log" Mar 20 11:13:26 crc kubenswrapper[4860]: I0320 11:13:26.610924 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-mhn62" Mar 20 11:13:26 crc kubenswrapper[4860]: I0320 11:13:26.723256 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xgg5j\" (UniqueName: \"kubernetes.io/projected/d2f28a99-89df-4152-a8d5-ddf3a8f3edcd-kube-api-access-xgg5j\") pod \"d2f28a99-89df-4152-a8d5-ddf3a8f3edcd\" (UID: \"d2f28a99-89df-4152-a8d5-ddf3a8f3edcd\") " Mar 20 11:13:26 crc kubenswrapper[4860]: I0320 11:13:26.731138 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2f28a99-89df-4152-a8d5-ddf3a8f3edcd-kube-api-access-xgg5j" (OuterVolumeSpecName: "kube-api-access-xgg5j") pod "d2f28a99-89df-4152-a8d5-ddf3a8f3edcd" (UID: "d2f28a99-89df-4152-a8d5-ddf3a8f3edcd"). InnerVolumeSpecName "kube-api-access-xgg5j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:13:26 crc kubenswrapper[4860]: I0320 11:13:26.825050 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xgg5j\" (UniqueName: \"kubernetes.io/projected/d2f28a99-89df-4152-a8d5-ddf3a8f3edcd-kube-api-access-xgg5j\") on node \"crc\" DevicePath \"\"" Mar 20 11:13:27 crc kubenswrapper[4860]: I0320 11:13:27.232385 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-82h6r" event={"ID":"f7193309-39f9-4487-b02b-8e9e4d6a69ff","Type":"ContainerStarted","Data":"d9e773f56fb45b1e2eca225ac2751b5f52b755312b96694cabce8aa90553d8fb"} Mar 20 11:13:27 crc kubenswrapper[4860]: I0320 11:13:27.232497 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-82h6r" event={"ID":"f7193309-39f9-4487-b02b-8e9e4d6a69ff","Type":"ContainerStarted","Data":"356c8758287eaad6a8db73dea56a760cc73fae7d56a51c80cc4690e20327fafd"} Mar 20 11:13:27 crc kubenswrapper[4860]: I0320 11:13:27.234787 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-mhn62_d2f28a99-89df-4152-a8d5-ddf3a8f3edcd/registry-server/0.log" Mar 20 11:13:27 crc kubenswrapper[4860]: I0320 11:13:27.234844 4860 generic.go:334] "Generic (PLEG): container finished" podID="d2f28a99-89df-4152-a8d5-ddf3a8f3edcd" containerID="b8bee47d82b7f9f6349bce4e0d028f154ea992dd78bd7c57f6c33abe1825070d" exitCode=2 Mar 20 11:13:27 crc kubenswrapper[4860]: I0320 11:13:27.234896 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-mhn62" event={"ID":"d2f28a99-89df-4152-a8d5-ddf3a8f3edcd","Type":"ContainerDied","Data":"b8bee47d82b7f9f6349bce4e0d028f154ea992dd78bd7c57f6c33abe1825070d"} Mar 20 11:13:27 crc kubenswrapper[4860]: I0320 11:13:27.234936 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-mhn62" 
event={"ID":"d2f28a99-89df-4152-a8d5-ddf3a8f3edcd","Type":"ContainerDied","Data":"e67e25cb07cccf98304aadd710a0423c4f122b21342a65a9b09ad9931a4c05cc"} Mar 20 11:13:27 crc kubenswrapper[4860]: I0320 11:13:27.235005 4860 scope.go:117] "RemoveContainer" containerID="b8bee47d82b7f9f6349bce4e0d028f154ea992dd78bd7c57f6c33abe1825070d" Mar 20 11:13:27 crc kubenswrapper[4860]: I0320 11:13:27.235155 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-mhn62" Mar 20 11:13:27 crc kubenswrapper[4860]: I0320 11:13:27.251843 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-82h6r" podStartSLOduration=2.134626385 podStartE2EDuration="2.251821296s" podCreationTimestamp="2026-03-20 11:13:25 +0000 UTC" firstStartedPulling="2026-03-20 11:13:26.32295121 +0000 UTC m=+1130.544312108" lastFinishedPulling="2026-03-20 11:13:26.440146121 +0000 UTC m=+1130.661507019" observedRunningTime="2026-03-20 11:13:27.249966506 +0000 UTC m=+1131.471327404" watchObservedRunningTime="2026-03-20 11:13:27.251821296 +0000 UTC m=+1131.473182194" Mar 20 11:13:27 crc kubenswrapper[4860]: I0320 11:13:27.257921 4860 scope.go:117] "RemoveContainer" containerID="b8bee47d82b7f9f6349bce4e0d028f154ea992dd78bd7c57f6c33abe1825070d" Mar 20 11:13:27 crc kubenswrapper[4860]: E0320 11:13:27.258593 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8bee47d82b7f9f6349bce4e0d028f154ea992dd78bd7c57f6c33abe1825070d\": container with ID starting with b8bee47d82b7f9f6349bce4e0d028f154ea992dd78bd7c57f6c33abe1825070d not found: ID does not exist" containerID="b8bee47d82b7f9f6349bce4e0d028f154ea992dd78bd7c57f6c33abe1825070d" Mar 20 11:13:27 crc kubenswrapper[4860]: I0320 11:13:27.258635 4860 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b8bee47d82b7f9f6349bce4e0d028f154ea992dd78bd7c57f6c33abe1825070d"} err="failed to get container status \"b8bee47d82b7f9f6349bce4e0d028f154ea992dd78bd7c57f6c33abe1825070d\": rpc error: code = NotFound desc = could not find container \"b8bee47d82b7f9f6349bce4e0d028f154ea992dd78bd7c57f6c33abe1825070d\": container with ID starting with b8bee47d82b7f9f6349bce4e0d028f154ea992dd78bd7c57f6c33abe1825070d not found: ID does not exist" Mar 20 11:13:27 crc kubenswrapper[4860]: I0320 11:13:27.275028 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-mhn62"] Mar 20 11:13:27 crc kubenswrapper[4860]: I0320 11:13:27.280031 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-mhn62"] Mar 20 11:13:27 crc kubenswrapper[4860]: I0320 11:13:27.431834 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2f28a99-89df-4152-a8d5-ddf3a8f3edcd" path="/var/lib/kubelet/pods/d2f28a99-89df-4152-a8d5-ddf3a8f3edcd/volumes" Mar 20 11:13:35 crc kubenswrapper[4860]: I0320 11:13:35.508401 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-82h6r" Mar 20 11:13:35 crc kubenswrapper[4860]: I0320 11:13:35.509022 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-82h6r" Mar 20 11:13:35 crc kubenswrapper[4860]: I0320 11:13:35.551102 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-82h6r" Mar 20 11:13:36 crc kubenswrapper[4860]: I0320 11:13:36.330389 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-82h6r" Mar 20 11:13:38 crc kubenswrapper[4860]: I0320 11:13:38.002073 4860 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858pzhxp"] Mar 20 11:13:38 crc kubenswrapper[4860]: E0320 11:13:38.002650 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2f28a99-89df-4152-a8d5-ddf3a8f3edcd" containerName="registry-server" Mar 20 11:13:38 crc kubenswrapper[4860]: I0320 11:13:38.002664 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2f28a99-89df-4152-a8d5-ddf3a8f3edcd" containerName="registry-server" Mar 20 11:13:38 crc kubenswrapper[4860]: I0320 11:13:38.002790 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2f28a99-89df-4152-a8d5-ddf3a8f3edcd" containerName="registry-server" Mar 20 11:13:38 crc kubenswrapper[4860]: I0320 11:13:38.003656 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858pzhxp" Mar 20 11:13:38 crc kubenswrapper[4860]: I0320 11:13:38.005965 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-d44nt" Mar 20 11:13:38 crc kubenswrapper[4860]: I0320 11:13:38.018731 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858pzhxp"] Mar 20 11:13:38 crc kubenswrapper[4860]: I0320 11:13:38.191210 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c6145c03-dfdd-4224-b2b0-6087b1f137d1-util\") pod \"8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858pzhxp\" (UID: \"c6145c03-dfdd-4224-b2b0-6087b1f137d1\") " pod="openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858pzhxp" Mar 20 11:13:38 crc kubenswrapper[4860]: I0320 11:13:38.191570 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kk2mj\" (UniqueName: 
\"kubernetes.io/projected/c6145c03-dfdd-4224-b2b0-6087b1f137d1-kube-api-access-kk2mj\") pod \"8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858pzhxp\" (UID: \"c6145c03-dfdd-4224-b2b0-6087b1f137d1\") " pod="openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858pzhxp" Mar 20 11:13:38 crc kubenswrapper[4860]: I0320 11:13:38.191888 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c6145c03-dfdd-4224-b2b0-6087b1f137d1-bundle\") pod \"8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858pzhxp\" (UID: \"c6145c03-dfdd-4224-b2b0-6087b1f137d1\") " pod="openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858pzhxp" Mar 20 11:13:38 crc kubenswrapper[4860]: I0320 11:13:38.292641 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c6145c03-dfdd-4224-b2b0-6087b1f137d1-bundle\") pod \"8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858pzhxp\" (UID: \"c6145c03-dfdd-4224-b2b0-6087b1f137d1\") " pod="openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858pzhxp" Mar 20 11:13:38 crc kubenswrapper[4860]: I0320 11:13:38.292738 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c6145c03-dfdd-4224-b2b0-6087b1f137d1-util\") pod \"8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858pzhxp\" (UID: \"c6145c03-dfdd-4224-b2b0-6087b1f137d1\") " pod="openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858pzhxp" Mar 20 11:13:38 crc kubenswrapper[4860]: I0320 11:13:38.292808 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kk2mj\" (UniqueName: \"kubernetes.io/projected/c6145c03-dfdd-4224-b2b0-6087b1f137d1-kube-api-access-kk2mj\") pod 
\"8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858pzhxp\" (UID: \"c6145c03-dfdd-4224-b2b0-6087b1f137d1\") " pod="openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858pzhxp"
Mar 20 11:13:38 crc kubenswrapper[4860]: I0320 11:13:38.293354 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c6145c03-dfdd-4224-b2b0-6087b1f137d1-bundle\") pod \"8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858pzhxp\" (UID: \"c6145c03-dfdd-4224-b2b0-6087b1f137d1\") " pod="openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858pzhxp"
Mar 20 11:13:38 crc kubenswrapper[4860]: I0320 11:13:38.293819 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c6145c03-dfdd-4224-b2b0-6087b1f137d1-util\") pod \"8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858pzhxp\" (UID: \"c6145c03-dfdd-4224-b2b0-6087b1f137d1\") " pod="openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858pzhxp"
Mar 20 11:13:38 crc kubenswrapper[4860]: I0320 11:13:38.316502 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kk2mj\" (UniqueName: \"kubernetes.io/projected/c6145c03-dfdd-4224-b2b0-6087b1f137d1-kube-api-access-kk2mj\") pod \"8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858pzhxp\" (UID: \"c6145c03-dfdd-4224-b2b0-6087b1f137d1\") " pod="openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858pzhxp"
Mar 20 11:13:38 crc kubenswrapper[4860]: I0320 11:13:38.322756 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858pzhxp"
Mar 20 11:13:38 crc kubenswrapper[4860]: I0320 11:13:38.879990 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858pzhxp"]
Mar 20 11:13:38 crc kubenswrapper[4860]: W0320 11:13:38.889432 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6145c03_dfdd_4224_b2b0_6087b1f137d1.slice/crio-1a103de90deaa637218ea1c0517843c848b1587548cb07effced6119e8a7d95b WatchSource:0}: Error finding container 1a103de90deaa637218ea1c0517843c848b1587548cb07effced6119e8a7d95b: Status 404 returned error can't find the container with id 1a103de90deaa637218ea1c0517843c848b1587548cb07effced6119e8a7d95b
Mar 20 11:13:39 crc kubenswrapper[4860]: I0320 11:13:39.330730 4860 generic.go:334] "Generic (PLEG): container finished" podID="c6145c03-dfdd-4224-b2b0-6087b1f137d1" containerID="e8129bf6a5c7e4443bc4223ef31160ba924a8902cead5ce3280ed2b69c93300d" exitCode=0
Mar 20 11:13:39 crc kubenswrapper[4860]: I0320 11:13:39.330903 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858pzhxp" event={"ID":"c6145c03-dfdd-4224-b2b0-6087b1f137d1","Type":"ContainerDied","Data":"e8129bf6a5c7e4443bc4223ef31160ba924a8902cead5ce3280ed2b69c93300d"}
Mar 20 11:13:39 crc kubenswrapper[4860]: I0320 11:13:39.331362 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858pzhxp" event={"ID":"c6145c03-dfdd-4224-b2b0-6087b1f137d1","Type":"ContainerStarted","Data":"1a103de90deaa637218ea1c0517843c848b1587548cb07effced6119e8a7d95b"}
Mar 20 11:13:40 crc kubenswrapper[4860]: I0320 11:13:40.343171 4860 generic.go:334] "Generic (PLEG): container finished" podID="c6145c03-dfdd-4224-b2b0-6087b1f137d1" containerID="75afa6cb74a1b94e312b3e23dce2a78c99ad6b105874c3714debba4f94c0a1d0" exitCode=0
Mar 20 11:13:40 crc kubenswrapper[4860]: I0320 11:13:40.343259 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858pzhxp" event={"ID":"c6145c03-dfdd-4224-b2b0-6087b1f137d1","Type":"ContainerDied","Data":"75afa6cb74a1b94e312b3e23dce2a78c99ad6b105874c3714debba4f94c0a1d0"}
Mar 20 11:13:41 crc kubenswrapper[4860]: I0320 11:13:41.354493 4860 generic.go:334] "Generic (PLEG): container finished" podID="c6145c03-dfdd-4224-b2b0-6087b1f137d1" containerID="bad9fff9bfd0e2323ffe61ba5a0146235093ea5f58bedbdcab77cef495d5e5dd" exitCode=0
Mar 20 11:13:41 crc kubenswrapper[4860]: I0320 11:13:41.354999 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858pzhxp" event={"ID":"c6145c03-dfdd-4224-b2b0-6087b1f137d1","Type":"ContainerDied","Data":"bad9fff9bfd0e2323ffe61ba5a0146235093ea5f58bedbdcab77cef495d5e5dd"}
Mar 20 11:13:42 crc kubenswrapper[4860]: I0320 11:13:42.767666 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858pzhxp"
Mar 20 11:13:42 crc kubenswrapper[4860]: I0320 11:13:42.967512 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c6145c03-dfdd-4224-b2b0-6087b1f137d1-util\") pod \"c6145c03-dfdd-4224-b2b0-6087b1f137d1\" (UID: \"c6145c03-dfdd-4224-b2b0-6087b1f137d1\") "
Mar 20 11:13:42 crc kubenswrapper[4860]: I0320 11:13:42.967612 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c6145c03-dfdd-4224-b2b0-6087b1f137d1-bundle\") pod \"c6145c03-dfdd-4224-b2b0-6087b1f137d1\" (UID: \"c6145c03-dfdd-4224-b2b0-6087b1f137d1\") "
Mar 20 11:13:42 crc kubenswrapper[4860]: I0320 11:13:42.967731 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kk2mj\" (UniqueName: \"kubernetes.io/projected/c6145c03-dfdd-4224-b2b0-6087b1f137d1-kube-api-access-kk2mj\") pod \"c6145c03-dfdd-4224-b2b0-6087b1f137d1\" (UID: \"c6145c03-dfdd-4224-b2b0-6087b1f137d1\") "
Mar 20 11:13:42 crc kubenswrapper[4860]: I0320 11:13:42.968758 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6145c03-dfdd-4224-b2b0-6087b1f137d1-bundle" (OuterVolumeSpecName: "bundle") pod "c6145c03-dfdd-4224-b2b0-6087b1f137d1" (UID: "c6145c03-dfdd-4224-b2b0-6087b1f137d1"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 11:13:42 crc kubenswrapper[4860]: I0320 11:13:42.975302 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6145c03-dfdd-4224-b2b0-6087b1f137d1-kube-api-access-kk2mj" (OuterVolumeSpecName: "kube-api-access-kk2mj") pod "c6145c03-dfdd-4224-b2b0-6087b1f137d1" (UID: "c6145c03-dfdd-4224-b2b0-6087b1f137d1"). InnerVolumeSpecName "kube-api-access-kk2mj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 11:13:42 crc kubenswrapper[4860]: I0320 11:13:42.981600 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6145c03-dfdd-4224-b2b0-6087b1f137d1-util" (OuterVolumeSpecName: "util") pod "c6145c03-dfdd-4224-b2b0-6087b1f137d1" (UID: "c6145c03-dfdd-4224-b2b0-6087b1f137d1"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 11:13:43 crc kubenswrapper[4860]: I0320 11:13:43.069299 4860 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c6145c03-dfdd-4224-b2b0-6087b1f137d1-util\") on node \"crc\" DevicePath \"\""
Mar 20 11:13:43 crc kubenswrapper[4860]: I0320 11:13:43.069379 4860 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c6145c03-dfdd-4224-b2b0-6087b1f137d1-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 11:13:43 crc kubenswrapper[4860]: I0320 11:13:43.069392 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kk2mj\" (UniqueName: \"kubernetes.io/projected/c6145c03-dfdd-4224-b2b0-6087b1f137d1-kube-api-access-kk2mj\") on node \"crc\" DevicePath \"\""
Mar 20 11:13:43 crc kubenswrapper[4860]: I0320 11:13:43.372642 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858pzhxp" event={"ID":"c6145c03-dfdd-4224-b2b0-6087b1f137d1","Type":"ContainerDied","Data":"1a103de90deaa637218ea1c0517843c848b1587548cb07effced6119e8a7d95b"}
Mar 20 11:13:43 crc kubenswrapper[4860]: I0320 11:13:43.373199 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a103de90deaa637218ea1c0517843c848b1587548cb07effced6119e8a7d95b"
Mar 20 11:13:43 crc kubenswrapper[4860]: I0320 11:13:43.372734 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858pzhxp"
Mar 20 11:13:50 crc kubenswrapper[4860]: I0320 11:13:50.617981 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-846ffbb776-dppd5"]
Mar 20 11:13:50 crc kubenswrapper[4860]: E0320 11:13:50.618914 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6145c03-dfdd-4224-b2b0-6087b1f137d1" containerName="util"
Mar 20 11:13:50 crc kubenswrapper[4860]: I0320 11:13:50.618929 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6145c03-dfdd-4224-b2b0-6087b1f137d1" containerName="util"
Mar 20 11:13:50 crc kubenswrapper[4860]: E0320 11:13:50.618937 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6145c03-dfdd-4224-b2b0-6087b1f137d1" containerName="extract"
Mar 20 11:13:50 crc kubenswrapper[4860]: I0320 11:13:50.618943 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6145c03-dfdd-4224-b2b0-6087b1f137d1" containerName="extract"
Mar 20 11:13:50 crc kubenswrapper[4860]: E0320 11:13:50.618954 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6145c03-dfdd-4224-b2b0-6087b1f137d1" containerName="pull"
Mar 20 11:13:50 crc kubenswrapper[4860]: I0320 11:13:50.618960 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6145c03-dfdd-4224-b2b0-6087b1f137d1" containerName="pull"
Mar 20 11:13:50 crc kubenswrapper[4860]: I0320 11:13:50.619084 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6145c03-dfdd-4224-b2b0-6087b1f137d1" containerName="extract"
Mar 20 11:13:50 crc kubenswrapper[4860]: I0320 11:13:50.619693 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-846ffbb776-dppd5"
Mar 20 11:13:50 crc kubenswrapper[4860]: I0320 11:13:50.622768 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-mz62q"
Mar 20 11:13:50 crc kubenswrapper[4860]: I0320 11:13:50.649887 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-846ffbb776-dppd5"]
Mar 20 11:13:50 crc kubenswrapper[4860]: I0320 11:13:50.682168 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8b57j\" (UniqueName: \"kubernetes.io/projected/31f3fcff-ca2c-40b5-bdf3-018132ccb63b-kube-api-access-8b57j\") pod \"openstack-operator-controller-init-846ffbb776-dppd5\" (UID: \"31f3fcff-ca2c-40b5-bdf3-018132ccb63b\") " pod="openstack-operators/openstack-operator-controller-init-846ffbb776-dppd5"
Mar 20 11:13:50 crc kubenswrapper[4860]: I0320 11:13:50.783698 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8b57j\" (UniqueName: \"kubernetes.io/projected/31f3fcff-ca2c-40b5-bdf3-018132ccb63b-kube-api-access-8b57j\") pod \"openstack-operator-controller-init-846ffbb776-dppd5\" (UID: \"31f3fcff-ca2c-40b5-bdf3-018132ccb63b\") " pod="openstack-operators/openstack-operator-controller-init-846ffbb776-dppd5"
Mar 20 11:13:50 crc kubenswrapper[4860]: I0320 11:13:50.807625 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8b57j\" (UniqueName: \"kubernetes.io/projected/31f3fcff-ca2c-40b5-bdf3-018132ccb63b-kube-api-access-8b57j\") pod \"openstack-operator-controller-init-846ffbb776-dppd5\" (UID: \"31f3fcff-ca2c-40b5-bdf3-018132ccb63b\") " pod="openstack-operators/openstack-operator-controller-init-846ffbb776-dppd5"
Mar 20 11:13:50 crc kubenswrapper[4860]: I0320 11:13:50.944450 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-846ffbb776-dppd5"
Mar 20 11:13:51 crc kubenswrapper[4860]: I0320 11:13:51.489341 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-846ffbb776-dppd5"]
Mar 20 11:13:52 crc kubenswrapper[4860]: I0320 11:13:52.344488 4860 patch_prober.go:28] interesting pod/machine-config-daemon-kvdqp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 11:13:52 crc kubenswrapper[4860]: I0320 11:13:52.345070 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 11:13:52 crc kubenswrapper[4860]: I0320 11:13:52.442119 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-846ffbb776-dppd5" event={"ID":"31f3fcff-ca2c-40b5-bdf3-018132ccb63b","Type":"ContainerStarted","Data":"26ca22ffb488640010b32159f489f1a42073cd419130b6c1debff8c9062f4330"}
Mar 20 11:13:57 crc kubenswrapper[4860]: I0320 11:13:57.497789 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-846ffbb776-dppd5" event={"ID":"31f3fcff-ca2c-40b5-bdf3-018132ccb63b","Type":"ContainerStarted","Data":"c688ae61d3d7e01c8c97058657caa3a3046a09fe5010c8fc985b85eda14c5cbf"}
Mar 20 11:13:57 crc kubenswrapper[4860]: I0320 11:13:57.499592 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-846ffbb776-dppd5"
Mar 20 11:13:57 crc kubenswrapper[4860]: I0320 11:13:57.534119 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-846ffbb776-dppd5" podStartSLOduration=1.726034311 podStartE2EDuration="7.534093882s" podCreationTimestamp="2026-03-20 11:13:50 +0000 UTC" firstStartedPulling="2026-03-20 11:13:51.4939245 +0000 UTC m=+1155.715285398" lastFinishedPulling="2026-03-20 11:13:57.301984071 +0000 UTC m=+1161.523344969" observedRunningTime="2026-03-20 11:13:57.532312544 +0000 UTC m=+1161.753673462" watchObservedRunningTime="2026-03-20 11:13:57.534093882 +0000 UTC m=+1161.755454780"
Mar 20 11:14:00 crc kubenswrapper[4860]: I0320 11:14:00.137898 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566754-wdxxk"]
Mar 20 11:14:00 crc kubenswrapper[4860]: I0320 11:14:00.139307 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566754-wdxxk"
Mar 20 11:14:00 crc kubenswrapper[4860]: I0320 11:14:00.150196 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 11:14:00 crc kubenswrapper[4860]: I0320 11:14:00.150196 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-82p2f"
Mar 20 11:14:00 crc kubenswrapper[4860]: I0320 11:14:00.150566 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 11:14:00 crc kubenswrapper[4860]: I0320 11:14:00.151882 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566754-wdxxk"]
Mar 20 11:14:00 crc kubenswrapper[4860]: I0320 11:14:00.238505 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbk9b\" (UniqueName: \"kubernetes.io/projected/a90da115-522c-4858-935f-7d4a7211c8cb-kube-api-access-sbk9b\") pod \"auto-csr-approver-29566754-wdxxk\" (UID: \"a90da115-522c-4858-935f-7d4a7211c8cb\") " pod="openshift-infra/auto-csr-approver-29566754-wdxxk"
Mar 20 11:14:00 crc kubenswrapper[4860]: I0320 11:14:00.339911 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbk9b\" (UniqueName: \"kubernetes.io/projected/a90da115-522c-4858-935f-7d4a7211c8cb-kube-api-access-sbk9b\") pod \"auto-csr-approver-29566754-wdxxk\" (UID: \"a90da115-522c-4858-935f-7d4a7211c8cb\") " pod="openshift-infra/auto-csr-approver-29566754-wdxxk"
Mar 20 11:14:00 crc kubenswrapper[4860]: I0320 11:14:00.370557 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbk9b\" (UniqueName: \"kubernetes.io/projected/a90da115-522c-4858-935f-7d4a7211c8cb-kube-api-access-sbk9b\") pod \"auto-csr-approver-29566754-wdxxk\" (UID: \"a90da115-522c-4858-935f-7d4a7211c8cb\") " pod="openshift-infra/auto-csr-approver-29566754-wdxxk"
Mar 20 11:14:00 crc kubenswrapper[4860]: I0320 11:14:00.477527 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566754-wdxxk"
Mar 20 11:14:00 crc kubenswrapper[4860]: I0320 11:14:00.740797 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566754-wdxxk"]
Mar 20 11:14:00 crc kubenswrapper[4860]: W0320 11:14:00.759437 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda90da115_522c_4858_935f_7d4a7211c8cb.slice/crio-7cec2c7efab9280ea550027c0d637c29d57bbfbd7df67987fc0fb4279adcde9e WatchSource:0}: Error finding container 7cec2c7efab9280ea550027c0d637c29d57bbfbd7df67987fc0fb4279adcde9e: Status 404 returned error can't find the container with id 7cec2c7efab9280ea550027c0d637c29d57bbfbd7df67987fc0fb4279adcde9e
Mar 20 11:14:01 crc kubenswrapper[4860]: I0320 11:14:01.527921 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566754-wdxxk" event={"ID":"a90da115-522c-4858-935f-7d4a7211c8cb","Type":"ContainerStarted","Data":"7cec2c7efab9280ea550027c0d637c29d57bbfbd7df67987fc0fb4279adcde9e"}
Mar 20 11:14:06 crc kubenswrapper[4860]: I0320 11:14:06.564650 4860 generic.go:334] "Generic (PLEG): container finished" podID="a90da115-522c-4858-935f-7d4a7211c8cb" containerID="f841889007b20caf67f3aa615ee7ba8514f947be4c45bbec766af3b7f7efe1d8" exitCode=0
Mar 20 11:14:06 crc kubenswrapper[4860]: I0320 11:14:06.565120 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566754-wdxxk" event={"ID":"a90da115-522c-4858-935f-7d4a7211c8cb","Type":"ContainerDied","Data":"f841889007b20caf67f3aa615ee7ba8514f947be4c45bbec766af3b7f7efe1d8"}
Mar 20 11:14:07 crc kubenswrapper[4860]: I0320 11:14:07.878002 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566754-wdxxk"
Mar 20 11:14:07 crc kubenswrapper[4860]: I0320 11:14:07.956618 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbk9b\" (UniqueName: \"kubernetes.io/projected/a90da115-522c-4858-935f-7d4a7211c8cb-kube-api-access-sbk9b\") pod \"a90da115-522c-4858-935f-7d4a7211c8cb\" (UID: \"a90da115-522c-4858-935f-7d4a7211c8cb\") "
Mar 20 11:14:07 crc kubenswrapper[4860]: I0320 11:14:07.964415 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a90da115-522c-4858-935f-7d4a7211c8cb-kube-api-access-sbk9b" (OuterVolumeSpecName: "kube-api-access-sbk9b") pod "a90da115-522c-4858-935f-7d4a7211c8cb" (UID: "a90da115-522c-4858-935f-7d4a7211c8cb"). InnerVolumeSpecName "kube-api-access-sbk9b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 11:14:08 crc kubenswrapper[4860]: I0320 11:14:08.058155 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbk9b\" (UniqueName: \"kubernetes.io/projected/a90da115-522c-4858-935f-7d4a7211c8cb-kube-api-access-sbk9b\") on node \"crc\" DevicePath \"\""
Mar 20 11:14:08 crc kubenswrapper[4860]: I0320 11:14:08.581792 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566754-wdxxk" event={"ID":"a90da115-522c-4858-935f-7d4a7211c8cb","Type":"ContainerDied","Data":"7cec2c7efab9280ea550027c0d637c29d57bbfbd7df67987fc0fb4279adcde9e"}
Mar 20 11:14:08 crc kubenswrapper[4860]: I0320 11:14:08.581847 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7cec2c7efab9280ea550027c0d637c29d57bbfbd7df67987fc0fb4279adcde9e"
Mar 20 11:14:08 crc kubenswrapper[4860]: I0320 11:14:08.581856 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566754-wdxxk"
Mar 20 11:14:08 crc kubenswrapper[4860]: I0320 11:14:08.932736 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566748-z8vsk"]
Mar 20 11:14:08 crc kubenswrapper[4860]: I0320 11:14:08.941136 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566748-z8vsk"]
Mar 20 11:14:09 crc kubenswrapper[4860]: I0320 11:14:09.422914 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d05f5e64-f0ec-45f9-a491-7dde7bdf6538" path="/var/lib/kubelet/pods/d05f5e64-f0ec-45f9-a491-7dde7bdf6538/volumes"
Mar 20 11:14:10 crc kubenswrapper[4860]: I0320 11:14:10.948911 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-846ffbb776-dppd5"
Mar 20 11:14:22 crc kubenswrapper[4860]: I0320 11:14:22.344616 4860 patch_prober.go:28] interesting pod/machine-config-daemon-kvdqp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 11:14:22 crc kubenswrapper[4860]: I0320 11:14:22.345451 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.354321 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-8dh72"]
Mar 20 11:14:29 crc kubenswrapper[4860]: E0320 11:14:29.355668 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a90da115-522c-4858-935f-7d4a7211c8cb" containerName="oc"
Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.355684 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="a90da115-522c-4858-935f-7d4a7211c8cb" containerName="oc"
Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.355804 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="a90da115-522c-4858-935f-7d4a7211c8cb" containerName="oc"
Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.356380 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-8dh72"
Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.359823 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-g929m"
Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.360384 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-s2kwq"]
Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.365339 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-s2kwq"
Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.367771 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-p9jpf"
Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.375156 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-8dh72"]
Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.382820 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-s2kwq"]
Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.447411 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-2692b"]
Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.448531 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-2692b"
Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.451284 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-tmw4g"
Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.460736 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n98jz\" (UniqueName: \"kubernetes.io/projected/178fff2d-699c-4cab-8626-3e30a6bd9ed6-kube-api-access-n98jz\") pod \"cinder-operator-controller-manager-8d58dc466-s2kwq\" (UID: \"178fff2d-699c-4cab-8626-3e30a6bd9ed6\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-s2kwq"
Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.460879 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k44b2\" (UniqueName: \"kubernetes.io/projected/8b4d2530-4f67-45e8-9444-bea25fdad6ae-kube-api-access-k44b2\") pod \"barbican-operator-controller-manager-59bc569d95-8dh72\" (UID: \"8b4d2530-4f67-45e8-9444-bea25fdad6ae\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-8dh72"
Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.478671 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-vw2d9"]
Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.479987 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-vw2d9"
Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.489495 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-k9fnc"
Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.489745 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-zphz9"]
Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.490944 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-zphz9"
Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.493859 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-t8v9p"
Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.503386 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-2692b"]
Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.519757 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-zphz9"]
Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.541309 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-vw2d9"]
Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.556416 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-wfczk"]
Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.557623 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-wfczk"
Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.562066 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n98jz\" (UniqueName: \"kubernetes.io/projected/178fff2d-699c-4cab-8626-3e30a6bd9ed6-kube-api-access-n98jz\") pod \"cinder-operator-controller-manager-8d58dc466-s2kwq\" (UID: \"178fff2d-699c-4cab-8626-3e30a6bd9ed6\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-s2kwq"
Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.562175 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkfnh\" (UniqueName: \"kubernetes.io/projected/20d35dc6-0fc2-4651-9dcd-855814132a5f-kube-api-access-nkfnh\") pod \"designate-operator-controller-manager-588d4d986b-2692b\" (UID: \"20d35dc6-0fc2-4651-9dcd-855814132a5f\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-2692b"
Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.562271 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8n2hk\" (UniqueName: \"kubernetes.io/projected/36138670-7449-4d49-8a23-73b57d10b67f-kube-api-access-8n2hk\") pod \"heat-operator-controller-manager-67dd5f86f5-vw2d9\" (UID: \"36138670-7449-4d49-8a23-73b57d10b67f\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-vw2d9"
Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.562300 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k44b2\" (UniqueName: \"kubernetes.io/projected/8b4d2530-4f67-45e8-9444-bea25fdad6ae-kube-api-access-k44b2\") pod \"barbican-operator-controller-manager-59bc569d95-8dh72\" (UID: \"8b4d2530-4f67-45e8-9444-bea25fdad6ae\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-8dh72"
Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.563590 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-gqpfd"
Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.566261 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-wfczk"]
Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.593319 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-669fff9c7c-njzqs"]
Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.594988 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-njzqs"
Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.602203 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert"
Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.608627 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-m7gfz"
Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.613716 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-669fff9c7c-njzqs"]
Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.627753 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-mc48w"]
Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.629769 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-mc48w"
Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.637757 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k44b2\" (UniqueName: \"kubernetes.io/projected/8b4d2530-4f67-45e8-9444-bea25fdad6ae-kube-api-access-k44b2\") pod \"barbican-operator-controller-manager-59bc569d95-8dh72\" (UID: \"8b4d2530-4f67-45e8-9444-bea25fdad6ae\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-8dh72"
Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.638426 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n98jz\" (UniqueName: \"kubernetes.io/projected/178fff2d-699c-4cab-8626-3e30a6bd9ed6-kube-api-access-n98jz\") pod \"cinder-operator-controller-manager-8d58dc466-s2kwq\" (UID: \"178fff2d-699c-4cab-8626-3e30a6bd9ed6\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-s2kwq"
Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.639434 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-8t2wr"
Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.649283 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-pq75b"]
Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.650291 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-pq75b"
Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.655402 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-fb2fd"
Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.660470 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-mc48w"]
Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.663567 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkfnh\" (UniqueName: \"kubernetes.io/projected/20d35dc6-0fc2-4651-9dcd-855814132a5f-kube-api-access-nkfnh\") pod \"designate-operator-controller-manager-588d4d986b-2692b\" (UID: \"20d35dc6-0fc2-4651-9dcd-855814132a5f\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-2692b"
Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.663642 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmzc5\" (UniqueName: \"kubernetes.io/projected/c54f27c4-bd61-4bad-bf91-376fee65d219-kube-api-access-pmzc5\") pod \"horizon-operator-controller-manager-8464cc45fb-wfczk\" (UID: \"c54f27c4-bd61-4bad-bf91-376fee65d219\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-wfczk"
Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.663683 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgkkw\" (UniqueName: \"kubernetes.io/projected/5fdbc315-f7fd-47ac-aa39-fdbe068f6f3b-kube-api-access-vgkkw\") pod \"glance-operator-controller-manager-79df6bcc97-zphz9\" (UID: \"5fdbc315-f7fd-47ac-aa39-fdbe068f6f3b\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-zphz9"
Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.663717 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8n2hk\" (UniqueName: \"kubernetes.io/projected/36138670-7449-4d49-8a23-73b57d10b67f-kube-api-access-8n2hk\") pod \"heat-operator-controller-manager-67dd5f86f5-vw2d9\" (UID: \"36138670-7449-4d49-8a23-73b57d10b67f\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-vw2d9"
Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.701337 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-pq75b"]
Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.704111 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8n2hk\" (UniqueName: \"kubernetes.io/projected/36138670-7449-4d49-8a23-73b57d10b67f-kube-api-access-8n2hk\") pod \"heat-operator-controller-manager-67dd5f86f5-vw2d9\" (UID: \"36138670-7449-4d49-8a23-73b57d10b67f\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-vw2d9"
Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.739867 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-pzk5m"]
Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.741188 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-pzk5m"
Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.742390 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-8dh72"
Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.744307 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-fkbb7"
Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.746106 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkfnh\" (UniqueName: \"kubernetes.io/projected/20d35dc6-0fc2-4651-9dcd-855814132a5f-kube-api-access-nkfnh\") pod \"designate-operator-controller-manager-588d4d986b-2692b\" (UID: \"20d35dc6-0fc2-4651-9dcd-855814132a5f\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-2692b"
Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.753516 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-s2kwq"
Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.761300 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-pzk5m"]
Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.765475 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r45xs\" (UniqueName: \"kubernetes.io/projected/acf57205-3b95-48a3-8222-1b57b0b6c54b-kube-api-access-r45xs\") pod \"ironic-operator-controller-manager-6f787dddc9-mc48w\" (UID: \"acf57205-3b95-48a3-8222-1b57b0b6c54b\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-mc48w"
Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.765540 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/70703379-8eb2-4f8a-95c8-302b53692a53-cert\") pod \"infra-operator-controller-manager-669fff9c7c-njzqs\" (UID:
\"70703379-8eb2-4f8a-95c8-302b53692a53\") " pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-njzqs" Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.765592 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmzc5\" (UniqueName: \"kubernetes.io/projected/c54f27c4-bd61-4bad-bf91-376fee65d219-kube-api-access-pmzc5\") pod \"horizon-operator-controller-manager-8464cc45fb-wfczk\" (UID: \"c54f27c4-bd61-4bad-bf91-376fee65d219\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-wfczk" Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.765691 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgkkw\" (UniqueName: \"kubernetes.io/projected/5fdbc315-f7fd-47ac-aa39-fdbe068f6f3b-kube-api-access-vgkkw\") pod \"glance-operator-controller-manager-79df6bcc97-zphz9\" (UID: \"5fdbc315-f7fd-47ac-aa39-fdbe068f6f3b\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-zphz9" Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.765721 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlfrn\" (UniqueName: \"kubernetes.io/projected/70703379-8eb2-4f8a-95c8-302b53692a53-kube-api-access-wlfrn\") pod \"infra-operator-controller-manager-669fff9c7c-njzqs\" (UID: \"70703379-8eb2-4f8a-95c8-302b53692a53\") " pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-njzqs" Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.765742 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9jc4\" (UniqueName: \"kubernetes.io/projected/fbbe8243-9afb-4fc5-90f1-04d6f0c074ef-kube-api-access-f9jc4\") pod \"keystone-operator-controller-manager-768b96df4c-pq75b\" (UID: \"fbbe8243-9afb-4fc5-90f1-04d6f0c074ef\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-pq75b" 
Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.765827 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-2692b" Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.854676 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-vw2d9" Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.855032 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmzc5\" (UniqueName: \"kubernetes.io/projected/c54f27c4-bd61-4bad-bf91-376fee65d219-kube-api-access-pmzc5\") pod \"horizon-operator-controller-manager-8464cc45fb-wfczk\" (UID: \"c54f27c4-bd61-4bad-bf91-376fee65d219\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-wfczk" Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.861376 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-m8948"] Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.863984 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgkkw\" (UniqueName: \"kubernetes.io/projected/5fdbc315-f7fd-47ac-aa39-fdbe068f6f3b-kube-api-access-vgkkw\") pod \"glance-operator-controller-manager-79df6bcc97-zphz9\" (UID: \"5fdbc315-f7fd-47ac-aa39-fdbe068f6f3b\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-zphz9" Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.961276 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-m8948" Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.962503 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-wfczk" Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.968460 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r45xs\" (UniqueName: \"kubernetes.io/projected/acf57205-3b95-48a3-8222-1b57b0b6c54b-kube-api-access-r45xs\") pod \"ironic-operator-controller-manager-6f787dddc9-mc48w\" (UID: \"acf57205-3b95-48a3-8222-1b57b0b6c54b\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-mc48w" Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.968537 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/70703379-8eb2-4f8a-95c8-302b53692a53-cert\") pod \"infra-operator-controller-manager-669fff9c7c-njzqs\" (UID: \"70703379-8eb2-4f8a-95c8-302b53692a53\") " pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-njzqs" Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.968712 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlfrn\" (UniqueName: \"kubernetes.io/projected/70703379-8eb2-4f8a-95c8-302b53692a53-kube-api-access-wlfrn\") pod \"infra-operator-controller-manager-669fff9c7c-njzqs\" (UID: \"70703379-8eb2-4f8a-95c8-302b53692a53\") " pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-njzqs" Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.968753 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9jc4\" (UniqueName: \"kubernetes.io/projected/fbbe8243-9afb-4fc5-90f1-04d6f0c074ef-kube-api-access-f9jc4\") pod \"keystone-operator-controller-manager-768b96df4c-pq75b\" (UID: \"fbbe8243-9afb-4fc5-90f1-04d6f0c074ef\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-pq75b" Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.968875 4860 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlvhs\" (UniqueName: \"kubernetes.io/projected/0fe9b978-da91-4568-9b77-0d5930aca888-kube-api-access-qlvhs\") pod \"manila-operator-controller-manager-55f864c847-pzk5m\" (UID: \"0fe9b978-da91-4568-9b77-0d5930aca888\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-pzk5m" Mar 20 11:14:29 crc kubenswrapper[4860]: E0320 11:14:29.969455 4860 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 11:14:29 crc kubenswrapper[4860]: E0320 11:14:29.969526 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70703379-8eb2-4f8a-95c8-302b53692a53-cert podName:70703379-8eb2-4f8a-95c8-302b53692a53 nodeName:}" failed. No retries permitted until 2026-03-20 11:14:30.469502745 +0000 UTC m=+1194.690863643 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/70703379-8eb2-4f8a-95c8-302b53692a53-cert") pod "infra-operator-controller-manager-669fff9c7c-njzqs" (UID: "70703379-8eb2-4f8a-95c8-302b53692a53") : secret "infra-operator-webhook-server-cert" not found Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.969970 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-b46m4" Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.998064 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-m8948"] Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.001332 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-2vsjq"] Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.003173 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-2vsjq" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.008174 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-mhqhl" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.028475 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlfrn\" (UniqueName: \"kubernetes.io/projected/70703379-8eb2-4f8a-95c8-302b53692a53-kube-api-access-wlfrn\") pod \"infra-operator-controller-manager-669fff9c7c-njzqs\" (UID: \"70703379-8eb2-4f8a-95c8-302b53692a53\") " pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-njzqs" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.039250 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-2vsjq"] Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.048096 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r45xs\" (UniqueName: \"kubernetes.io/projected/acf57205-3b95-48a3-8222-1b57b0b6c54b-kube-api-access-r45xs\") pod \"ironic-operator-controller-manager-6f787dddc9-mc48w\" (UID: \"acf57205-3b95-48a3-8222-1b57b0b6c54b\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-mc48w" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.056365 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-z8fp5"] Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.057662 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-z8fp5" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.062489 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-jxfnd" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.068288 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-tjt52"] Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.070676 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlvhs\" (UniqueName: \"kubernetes.io/projected/0fe9b978-da91-4568-9b77-0d5930aca888-kube-api-access-qlvhs\") pod \"manila-operator-controller-manager-55f864c847-pzk5m\" (UID: \"0fe9b978-da91-4568-9b77-0d5930aca888\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-pzk5m" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.070722 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99lpj\" (UniqueName: \"kubernetes.io/projected/d7202366-6dc1-45ca-bb9a-74bdd0426c5f-kube-api-access-99lpj\") pod \"mariadb-operator-controller-manager-67ccfc9778-m8948\" (UID: \"d7202366-6dc1-45ca-bb9a-74bdd0426c5f\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-m8948" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.070803 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzznl\" (UniqueName: \"kubernetes.io/projected/6c2530cf-70b4-4a89-acff-086b36773edf-kube-api-access-dzznl\") pod \"nova-operator-controller-manager-5d488d59fb-z8fp5\" (UID: \"6c2530cf-70b4-4a89-acff-086b36773edf\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-z8fp5" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.070824 4860 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmgx4\" (UniqueName: \"kubernetes.io/projected/29801d0c-963e-4b38-ad2d-8b03d3ade0be-kube-api-access-lmgx4\") pod \"neutron-operator-controller-manager-767865f676-2vsjq\" (UID: \"29801d0c-963e-4b38-ad2d-8b03d3ade0be\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-2vsjq" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.071070 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-tjt52" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.071588 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9jc4\" (UniqueName: \"kubernetes.io/projected/fbbe8243-9afb-4fc5-90f1-04d6f0c074ef-kube-api-access-f9jc4\") pod \"keystone-operator-controller-manager-768b96df4c-pq75b\" (UID: \"fbbe8243-9afb-4fc5-90f1-04d6f0c074ef\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-pq75b" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.080413 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-z77q2" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.093550 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-z8fp5"] Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.109309 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-tjt52"] Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.109377 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-4nk5c"] Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.112560 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-4nk5c" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.116089 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-5tqgx"] Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.117093 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-p7g4n" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.117200 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-5tqgx" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.118021 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlvhs\" (UniqueName: \"kubernetes.io/projected/0fe9b978-da91-4568-9b77-0d5930aca888-kube-api-access-qlvhs\") pod \"manila-operator-controller-manager-55f864c847-pzk5m\" (UID: \"0fe9b978-da91-4568-9b77-0d5930aca888\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-pzk5m" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.125050 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.125546 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-zphz9" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.129211 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-k6th9" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.153039 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-4nk5c"] Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.172376 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99lpj\" (UniqueName: \"kubernetes.io/projected/d7202366-6dc1-45ca-bb9a-74bdd0426c5f-kube-api-access-99lpj\") pod \"mariadb-operator-controller-manager-67ccfc9778-m8948\" (UID: \"d7202366-6dc1-45ca-bb9a-74bdd0426c5f\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-m8948" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.172426 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7pvw\" (UniqueName: \"kubernetes.io/projected/431ab970-7f36-4ace-860c-479faac092a0-kube-api-access-q7pvw\") pod \"octavia-operator-controller-manager-5b9f45d989-tjt52\" (UID: \"431ab970-7f36-4ace-860c-479faac092a0\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-tjt52" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.172485 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nn2x\" (UniqueName: \"kubernetes.io/projected/c736e6d7-6806-4ef3-a0b3-f1b17ab33037-kube-api-access-4nn2x\") pod \"ovn-operator-controller-manager-884679f54-4nk5c\" (UID: \"c736e6d7-6806-4ef3-a0b3-f1b17ab33037\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-4nk5c" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.172507 4860 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ecf64e38-138d-4ef7-8b17-c09f30358f3e-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-5tqgx\" (UID: \"ecf64e38-138d-4ef7-8b17-c09f30358f3e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-5tqgx" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.172535 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzznl\" (UniqueName: \"kubernetes.io/projected/6c2530cf-70b4-4a89-acff-086b36773edf-kube-api-access-dzznl\") pod \"nova-operator-controller-manager-5d488d59fb-z8fp5\" (UID: \"6c2530cf-70b4-4a89-acff-086b36773edf\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-z8fp5" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.172555 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmgx4\" (UniqueName: \"kubernetes.io/projected/29801d0c-963e-4b38-ad2d-8b03d3ade0be-kube-api-access-lmgx4\") pod \"neutron-operator-controller-manager-767865f676-2vsjq\" (UID: \"29801d0c-963e-4b38-ad2d-8b03d3ade0be\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-2vsjq" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.172578 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmqk5\" (UniqueName: \"kubernetes.io/projected/ecf64e38-138d-4ef7-8b17-c09f30358f3e-kube-api-access-pmqk5\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-5tqgx\" (UID: \"ecf64e38-138d-4ef7-8b17-c09f30358f3e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-5tqgx" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.201310 4860 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/placement-operator-controller-manager-5784578c99-4tdg4"] Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.202600 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-4tdg4" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.221315 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-5pnn4" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.225247 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-5tqgx"] Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.230145 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzznl\" (UniqueName: \"kubernetes.io/projected/6c2530cf-70b4-4a89-acff-086b36773edf-kube-api-access-dzznl\") pod \"nova-operator-controller-manager-5d488d59fb-z8fp5\" (UID: \"6c2530cf-70b4-4a89-acff-086b36773edf\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-z8fp5" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.243300 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmgx4\" (UniqueName: \"kubernetes.io/projected/29801d0c-963e-4b38-ad2d-8b03d3ade0be-kube-api-access-lmgx4\") pod \"neutron-operator-controller-manager-767865f676-2vsjq\" (UID: \"29801d0c-963e-4b38-ad2d-8b03d3ade0be\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-2vsjq" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.260740 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-pzk5m" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.276486 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hq7gw\" (UniqueName: \"kubernetes.io/projected/7f73053a-86aa-42dc-bcca-ee26a4fda2e5-kube-api-access-hq7gw\") pod \"placement-operator-controller-manager-5784578c99-4tdg4\" (UID: \"7f73053a-86aa-42dc-bcca-ee26a4fda2e5\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-4tdg4" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.276580 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7pvw\" (UniqueName: \"kubernetes.io/projected/431ab970-7f36-4ace-860c-479faac092a0-kube-api-access-q7pvw\") pod \"octavia-operator-controller-manager-5b9f45d989-tjt52\" (UID: \"431ab970-7f36-4ace-860c-479faac092a0\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-tjt52" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.276702 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nn2x\" (UniqueName: \"kubernetes.io/projected/c736e6d7-6806-4ef3-a0b3-f1b17ab33037-kube-api-access-4nn2x\") pod \"ovn-operator-controller-manager-884679f54-4nk5c\" (UID: \"c736e6d7-6806-4ef3-a0b3-f1b17ab33037\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-4nk5c" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.276735 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ecf64e38-138d-4ef7-8b17-c09f30358f3e-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-5tqgx\" (UID: \"ecf64e38-138d-4ef7-8b17-c09f30358f3e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-5tqgx" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 
11:14:30.276767 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmqk5\" (UniqueName: \"kubernetes.io/projected/ecf64e38-138d-4ef7-8b17-c09f30358f3e-kube-api-access-pmqk5\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-5tqgx\" (UID: \"ecf64e38-138d-4ef7-8b17-c09f30358f3e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-5tqgx" Mar 20 11:14:30 crc kubenswrapper[4860]: E0320 11:14:30.277519 4860 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 11:14:30 crc kubenswrapper[4860]: E0320 11:14:30.277574 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ecf64e38-138d-4ef7-8b17-c09f30358f3e-cert podName:ecf64e38-138d-4ef7-8b17-c09f30358f3e nodeName:}" failed. No retries permitted until 2026-03-20 11:14:30.777558421 +0000 UTC m=+1194.998919319 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ecf64e38-138d-4ef7-8b17-c09f30358f3e-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-5tqgx" (UID: "ecf64e38-138d-4ef7-8b17-c09f30358f3e") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.296825 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-dvptb"] Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.311975 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-dvptb" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.317152 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99lpj\" (UniqueName: \"kubernetes.io/projected/d7202366-6dc1-45ca-bb9a-74bdd0426c5f-kube-api-access-99lpj\") pod \"mariadb-operator-controller-manager-67ccfc9778-m8948\" (UID: \"d7202366-6dc1-45ca-bb9a-74bdd0426c5f\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-m8948" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.317815 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-hz52j" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.358902 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-2vsjq" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.361819 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-pq75b" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.364080 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmqk5\" (UniqueName: \"kubernetes.io/projected/ecf64e38-138d-4ef7-8b17-c09f30358f3e-kube-api-access-pmqk5\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-5tqgx\" (UID: \"ecf64e38-138d-4ef7-8b17-c09f30358f3e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-5tqgx" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.364868 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-mc48w" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.372487 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7pvw\" (UniqueName: \"kubernetes.io/projected/431ab970-7f36-4ace-860c-479faac092a0-kube-api-access-q7pvw\") pod \"octavia-operator-controller-manager-5b9f45d989-tjt52\" (UID: \"431ab970-7f36-4ace-860c-479faac092a0\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-tjt52" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.372643 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-dvptb"] Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.380011 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbkkg\" (UniqueName: \"kubernetes.io/projected/cce5926a-9df6-4915-a94f-02cf2f74fccc-kube-api-access-rbkkg\") pod \"swift-operator-controller-manager-c674c5965-dvptb\" (UID: \"cce5926a-9df6-4915-a94f-02cf2f74fccc\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-dvptb" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.380120 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hq7gw\" (UniqueName: \"kubernetes.io/projected/7f73053a-86aa-42dc-bcca-ee26a4fda2e5-kube-api-access-hq7gw\") pod \"placement-operator-controller-manager-5784578c99-4tdg4\" (UID: \"7f73053a-86aa-42dc-bcca-ee26a4fda2e5\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-4tdg4" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.388515 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-4tdg4"] Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.400946 4860 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-jd9bn"] Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.402032 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-jd9bn" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.419690 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nn2x\" (UniqueName: \"kubernetes.io/projected/c736e6d7-6806-4ef3-a0b3-f1b17ab33037-kube-api-access-4nn2x\") pod \"ovn-operator-controller-manager-884679f54-4nk5c\" (UID: \"c736e6d7-6806-4ef3-a0b3-f1b17ab33037\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-4nk5c" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.426539 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-b4zcf"] Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.427736 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-b4zcf" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.433965 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-jd9bn"] Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.437637 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-z8fp5" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.439291 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-75zw2" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.441408 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-b4zcf"] Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.449208 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-tjt52" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.457040 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-zljps" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.493949 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-ncmzn"] Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.495349 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/70703379-8eb2-4f8a-95c8-302b53692a53-cert\") pod \"infra-operator-controller-manager-669fff9c7c-njzqs\" (UID: \"70703379-8eb2-4f8a-95c8-302b53692a53\") " pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-njzqs" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.495527 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbkkg\" (UniqueName: \"kubernetes.io/projected/cce5926a-9df6-4915-a94f-02cf2f74fccc-kube-api-access-rbkkg\") pod \"swift-operator-controller-manager-c674c5965-dvptb\" (UID: \"cce5926a-9df6-4915-a94f-02cf2f74fccc\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-dvptb" Mar 
20 11:14:30 crc kubenswrapper[4860]: E0320 11:14:30.496277 4860 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 11:14:30 crc kubenswrapper[4860]: E0320 11:14:30.496431 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70703379-8eb2-4f8a-95c8-302b53692a53-cert podName:70703379-8eb2-4f8a-95c8-302b53692a53 nodeName:}" failed. No retries permitted until 2026-03-20 11:14:31.496412824 +0000 UTC m=+1195.717773712 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/70703379-8eb2-4f8a-95c8-302b53692a53-cert") pod "infra-operator-controller-manager-669fff9c7c-njzqs" (UID: "70703379-8eb2-4f8a-95c8-302b53692a53") : secret "infra-operator-webhook-server-cert" not found Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.497218 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-4nk5c" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.529047 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hq7gw\" (UniqueName: \"kubernetes.io/projected/7f73053a-86aa-42dc-bcca-ee26a4fda2e5-kube-api-access-hq7gw\") pod \"placement-operator-controller-manager-5784578c99-4tdg4\" (UID: \"7f73053a-86aa-42dc-bcca-ee26a4fda2e5\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-4tdg4" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.537551 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-ncmzn" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.542093 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-ncmzn"] Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.553429 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbkkg\" (UniqueName: \"kubernetes.io/projected/cce5926a-9df6-4915-a94f-02cf2f74fccc-kube-api-access-rbkkg\") pod \"swift-operator-controller-manager-c674c5965-dvptb\" (UID: \"cce5926a-9df6-4915-a94f-02cf2f74fccc\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-dvptb" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.556105 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-l9f4c" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.614158 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-m8948" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.617484 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-4tdg4" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.618898 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dck6p\" (UniqueName: \"kubernetes.io/projected/f329ab6d-5c8c-4ed2-a830-d0a04bb31071-kube-api-access-dck6p\") pod \"test-operator-controller-manager-5c5cb9c4d7-b4zcf\" (UID: \"f329ab6d-5c8c-4ed2-a830-d0a04bb31071\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-b4zcf" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.619025 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhllj\" (UniqueName: \"kubernetes.io/projected/1723efcf-97d7-4101-a15d-d4776d45d29b-kube-api-access-qhllj\") pod \"watcher-operator-controller-manager-6c4d75f7f9-ncmzn\" (UID: \"1723efcf-97d7-4101-a15d-d4776d45d29b\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-ncmzn" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.619106 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ph84b\" (UniqueName: \"kubernetes.io/projected/b5e881e2-f657-418f-ba87-7074722307a2-kube-api-access-ph84b\") pod \"telemetry-operator-controller-manager-d6b694c5-jd9bn\" (UID: \"b5e881e2-f657-418f-ba87-7074722307a2\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-jd9bn" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.691173 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-dvptb" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.721253 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ph84b\" (UniqueName: \"kubernetes.io/projected/b5e881e2-f657-418f-ba87-7074722307a2-kube-api-access-ph84b\") pod \"telemetry-operator-controller-manager-d6b694c5-jd9bn\" (UID: \"b5e881e2-f657-418f-ba87-7074722307a2\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-jd9bn" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.721358 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dck6p\" (UniqueName: \"kubernetes.io/projected/f329ab6d-5c8c-4ed2-a830-d0a04bb31071-kube-api-access-dck6p\") pod \"test-operator-controller-manager-5c5cb9c4d7-b4zcf\" (UID: \"f329ab6d-5c8c-4ed2-a830-d0a04bb31071\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-b4zcf" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.721436 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhllj\" (UniqueName: \"kubernetes.io/projected/1723efcf-97d7-4101-a15d-d4776d45d29b-kube-api-access-qhllj\") pod \"watcher-operator-controller-manager-6c4d75f7f9-ncmzn\" (UID: \"1723efcf-97d7-4101-a15d-d4776d45d29b\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-ncmzn" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.773706 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ph84b\" (UniqueName: \"kubernetes.io/projected/b5e881e2-f657-418f-ba87-7074722307a2-kube-api-access-ph84b\") pod \"telemetry-operator-controller-manager-d6b694c5-jd9bn\" (UID: \"b5e881e2-f657-418f-ba87-7074722307a2\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-jd9bn" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.785000 4860 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-jd9bn" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.821049 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dck6p\" (UniqueName: \"kubernetes.io/projected/f329ab6d-5c8c-4ed2-a830-d0a04bb31071-kube-api-access-dck6p\") pod \"test-operator-controller-manager-5c5cb9c4d7-b4zcf\" (UID: \"f329ab6d-5c8c-4ed2-a830-d0a04bb31071\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-b4zcf" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.823360 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ecf64e38-138d-4ef7-8b17-c09f30358f3e-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-5tqgx\" (UID: \"ecf64e38-138d-4ef7-8b17-c09f30358f3e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-5tqgx" Mar 20 11:14:30 crc kubenswrapper[4860]: E0320 11:14:30.823551 4860 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 11:14:30 crc kubenswrapper[4860]: E0320 11:14:30.823605 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ecf64e38-138d-4ef7-8b17-c09f30358f3e-cert podName:ecf64e38-138d-4ef7-8b17-c09f30358f3e nodeName:}" failed. No retries permitted until 2026-03-20 11:14:31.823589188 +0000 UTC m=+1196.044950086 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ecf64e38-138d-4ef7-8b17-c09f30358f3e-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-5tqgx" (UID: "ecf64e38-138d-4ef7-8b17-c09f30358f3e") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.839883 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6697dffbc-hpk42"] Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.841128 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-hpk42" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.851801 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-rz2ll" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.852039 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.853869 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.869303 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhllj\" (UniqueName: \"kubernetes.io/projected/1723efcf-97d7-4101-a15d-d4776d45d29b-kube-api-access-qhllj\") pod \"watcher-operator-controller-manager-6c4d75f7f9-ncmzn\" (UID: \"1723efcf-97d7-4101-a15d-d4776d45d29b\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-ncmzn" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.878173 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6697dffbc-hpk42"] Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 
11:14:30.924491 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-metrics-certs\") pod \"openstack-operator-controller-manager-6697dffbc-hpk42\" (UID: \"84431296-0ca0-425a-8da8-c3ea46b08b29\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-hpk42" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.924577 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6bb2\" (UniqueName: \"kubernetes.io/projected/84431296-0ca0-425a-8da8-c3ea46b08b29-kube-api-access-g6bb2\") pod \"openstack-operator-controller-manager-6697dffbc-hpk42\" (UID: \"84431296-0ca0-425a-8da8-c3ea46b08b29\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-hpk42" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.924677 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-webhook-certs\") pod \"openstack-operator-controller-manager-6697dffbc-hpk42\" (UID: \"84431296-0ca0-425a-8da8-c3ea46b08b29\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-hpk42" Mar 20 11:14:31 crc kubenswrapper[4860]: I0320 11:14:31.026475 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-webhook-certs\") pod \"openstack-operator-controller-manager-6697dffbc-hpk42\" (UID: \"84431296-0ca0-425a-8da8-c3ea46b08b29\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-hpk42" Mar 20 11:14:31 crc kubenswrapper[4860]: I0320 11:14:31.026999 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-metrics-certs\") pod \"openstack-operator-controller-manager-6697dffbc-hpk42\" (UID: \"84431296-0ca0-425a-8da8-c3ea46b08b29\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-hpk42" Mar 20 11:14:31 crc kubenswrapper[4860]: E0320 11:14:31.026691 4860 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 11:14:31 crc kubenswrapper[4860]: I0320 11:14:31.027055 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6bb2\" (UniqueName: \"kubernetes.io/projected/84431296-0ca0-425a-8da8-c3ea46b08b29-kube-api-access-g6bb2\") pod \"openstack-operator-controller-manager-6697dffbc-hpk42\" (UID: \"84431296-0ca0-425a-8da8-c3ea46b08b29\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-hpk42" Mar 20 11:14:31 crc kubenswrapper[4860]: E0320 11:14:31.027107 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-webhook-certs podName:84431296-0ca0-425a-8da8-c3ea46b08b29 nodeName:}" failed. No retries permitted until 2026-03-20 11:14:31.527085434 +0000 UTC m=+1195.748446332 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-webhook-certs") pod "openstack-operator-controller-manager-6697dffbc-hpk42" (UID: "84431296-0ca0-425a-8da8-c3ea46b08b29") : secret "webhook-server-cert" not found Mar 20 11:14:31 crc kubenswrapper[4860]: E0320 11:14:31.027207 4860 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 11:14:31 crc kubenswrapper[4860]: E0320 11:14:31.027300 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-metrics-certs podName:84431296-0ca0-425a-8da8-c3ea46b08b29 nodeName:}" failed. No retries permitted until 2026-03-20 11:14:31.52727961 +0000 UTC m=+1195.748640508 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-metrics-certs") pod "openstack-operator-controller-manager-6697dffbc-hpk42" (UID: "84431296-0ca0-425a-8da8-c3ea46b08b29") : secret "metrics-server-cert" not found Mar 20 11:14:31 crc kubenswrapper[4860]: I0320 11:14:31.077147 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6bb2\" (UniqueName: \"kubernetes.io/projected/84431296-0ca0-425a-8da8-c3ea46b08b29-kube-api-access-g6bb2\") pod \"openstack-operator-controller-manager-6697dffbc-hpk42\" (UID: \"84431296-0ca0-425a-8da8-c3ea46b08b29\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-hpk42" Mar 20 11:14:31 crc kubenswrapper[4860]: I0320 11:14:31.108350 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-b4zcf" Mar 20 11:14:31 crc kubenswrapper[4860]: I0320 11:14:31.174545 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-ncmzn" Mar 20 11:14:31 crc kubenswrapper[4860]: I0320 11:14:31.567197 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-metrics-certs\") pod \"openstack-operator-controller-manager-6697dffbc-hpk42\" (UID: \"84431296-0ca0-425a-8da8-c3ea46b08b29\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-hpk42" Mar 20 11:14:31 crc kubenswrapper[4860]: I0320 11:14:31.567378 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/70703379-8eb2-4f8a-95c8-302b53692a53-cert\") pod \"infra-operator-controller-manager-669fff9c7c-njzqs\" (UID: \"70703379-8eb2-4f8a-95c8-302b53692a53\") " pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-njzqs" Mar 20 11:14:31 crc kubenswrapper[4860]: I0320 11:14:31.567433 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-webhook-certs\") pod \"openstack-operator-controller-manager-6697dffbc-hpk42\" (UID: \"84431296-0ca0-425a-8da8-c3ea46b08b29\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-hpk42" Mar 20 11:14:31 crc kubenswrapper[4860]: E0320 11:14:31.567626 4860 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 11:14:31 crc kubenswrapper[4860]: E0320 11:14:31.567699 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-webhook-certs podName:84431296-0ca0-425a-8da8-c3ea46b08b29 nodeName:}" failed. No retries permitted until 2026-03-20 11:14:32.567676334 +0000 UTC m=+1196.789037232 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-webhook-certs") pod "openstack-operator-controller-manager-6697dffbc-hpk42" (UID: "84431296-0ca0-425a-8da8-c3ea46b08b29") : secret "webhook-server-cert" not found Mar 20 11:14:31 crc kubenswrapper[4860]: E0320 11:14:31.568262 4860 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 11:14:31 crc kubenswrapper[4860]: E0320 11:14:31.568301 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-metrics-certs podName:84431296-0ca0-425a-8da8-c3ea46b08b29 nodeName:}" failed. No retries permitted until 2026-03-20 11:14:32.56829061 +0000 UTC m=+1196.789651518 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-metrics-certs") pod "openstack-operator-controller-manager-6697dffbc-hpk42" (UID: "84431296-0ca0-425a-8da8-c3ea46b08b29") : secret "metrics-server-cert" not found Mar 20 11:14:31 crc kubenswrapper[4860]: E0320 11:14:31.568377 4860 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 11:14:31 crc kubenswrapper[4860]: E0320 11:14:31.568406 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70703379-8eb2-4f8a-95c8-302b53692a53-cert podName:70703379-8eb2-4f8a-95c8-302b53692a53 nodeName:}" failed. No retries permitted until 2026-03-20 11:14:33.568396873 +0000 UTC m=+1197.789757771 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/70703379-8eb2-4f8a-95c8-302b53692a53-cert") pod "infra-operator-controller-manager-669fff9c7c-njzqs" (UID: "70703379-8eb2-4f8a-95c8-302b53692a53") : secret "infra-operator-webhook-server-cert" not found Mar 20 11:14:31 crc kubenswrapper[4860]: I0320 11:14:31.875173 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ecf64e38-138d-4ef7-8b17-c09f30358f3e-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-5tqgx\" (UID: \"ecf64e38-138d-4ef7-8b17-c09f30358f3e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-5tqgx" Mar 20 11:14:31 crc kubenswrapper[4860]: E0320 11:14:31.876719 4860 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 11:14:31 crc kubenswrapper[4860]: E0320 11:14:31.877443 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ecf64e38-138d-4ef7-8b17-c09f30358f3e-cert podName:ecf64e38-138d-4ef7-8b17-c09f30358f3e nodeName:}" failed. No retries permitted until 2026-03-20 11:14:33.877392735 +0000 UTC m=+1198.098753633 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ecf64e38-138d-4ef7-8b17-c09f30358f3e-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-5tqgx" (UID: "ecf64e38-138d-4ef7-8b17-c09f30358f3e") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 11:14:32 crc kubenswrapper[4860]: I0320 11:14:32.101691 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-2vsjq"] Mar 20 11:14:32 crc kubenswrapper[4860]: I0320 11:14:32.121299 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-s2kwq"] Mar 20 11:14:32 crc kubenswrapper[4860]: I0320 11:14:32.146917 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-8dh72"] Mar 20 11:14:32 crc kubenswrapper[4860]: I0320 11:14:32.200044 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-vw2d9"] Mar 20 11:14:32 crc kubenswrapper[4860]: W0320 11:14:32.266441 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20d35dc6_0fc2_4651_9dcd_855814132a5f.slice/crio-2fb6847ae3e5e9932ad364315ce61b7497472147fbe6029885f0eaaa5991042b WatchSource:0}: Error finding container 2fb6847ae3e5e9932ad364315ce61b7497472147fbe6029885f0eaaa5991042b: Status 404 returned error can't find the container with id 2fb6847ae3e5e9932ad364315ce61b7497472147fbe6029885f0eaaa5991042b Mar 20 11:14:32 crc kubenswrapper[4860]: I0320 11:14:32.272670 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-pzk5m"] Mar 20 11:14:32 crc kubenswrapper[4860]: W0320 11:14:32.274098 4860 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36138670_7449_4d49_8a23_73b57d10b67f.slice/crio-653d999fec5003aae812deaa1315681ba24a09182a99ff5606e9a5eb982df48f WatchSource:0}: Error finding container 653d999fec5003aae812deaa1315681ba24a09182a99ff5606e9a5eb982df48f: Status 404 returned error can't find the container with id 653d999fec5003aae812deaa1315681ba24a09182a99ff5606e9a5eb982df48f Mar 20 11:14:32 crc kubenswrapper[4860]: I0320 11:14:32.286548 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-2692b"] Mar 20 11:14:32 crc kubenswrapper[4860]: W0320 11:14:32.302561 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcce5926a_9df6_4915_a94f_02cf2f74fccc.slice/crio-ba3ced7813e4fd7fd87849bcff9024dafc3336c86d4c304ecb54685fadca0e60 WatchSource:0}: Error finding container ba3ced7813e4fd7fd87849bcff9024dafc3336c86d4c304ecb54685fadca0e60: Status 404 returned error can't find the container with id ba3ced7813e4fd7fd87849bcff9024dafc3336c86d4c304ecb54685fadca0e60 Mar 20 11:14:32 crc kubenswrapper[4860]: W0320 11:14:32.304252 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podacf57205_3b95_48a3_8222_1b57b0b6c54b.slice/crio-e927bae90201f5a5587c43f8dd59c8f99798df5507e3f73c3d5ab213af1af82e WatchSource:0}: Error finding container e927bae90201f5a5587c43f8dd59c8f99798df5507e3f73c3d5ab213af1af82e: Status 404 returned error can't find the container with id e927bae90201f5a5587c43f8dd59c8f99798df5507e3f73c3d5ab213af1af82e Mar 20 11:14:32 crc kubenswrapper[4860]: W0320 11:14:32.306809 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f73053a_86aa_42dc_bcca_ee26a4fda2e5.slice/crio-37f3ed3334cc6fb0c5672170f44bb421e636f431221af8f9255733a9bade8084 
WatchSource:0}: Error finding container 37f3ed3334cc6fb0c5672170f44bb421e636f431221af8f9255733a9bade8084: Status 404 returned error can't find the container with id 37f3ed3334cc6fb0c5672170f44bb421e636f431221af8f9255733a9bade8084 Mar 20 11:14:32 crc kubenswrapper[4860]: I0320 11:14:32.309890 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-zphz9"] Mar 20 11:14:32 crc kubenswrapper[4860]: E0320 11:14:32.315858 4860 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:c8743a6661d118b0e5ba3eb110643358a8a3237dc75984a8f9829880b55a1622,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hq7gw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 
8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-5784578c99-4tdg4_openstack-operators(7f73053a-86aa-42dc-bcca-ee26a4fda2e5): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 11:14:32 crc kubenswrapper[4860]: E0320 11:14:32.317146 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-5784578c99-4tdg4" podUID="7f73053a-86aa-42dc-bcca-ee26a4fda2e5" Mar 20 11:14:32 crc kubenswrapper[4860]: I0320 11:14:32.318168 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-dvptb"] Mar 20 11:14:32 crc kubenswrapper[4860]: I0320 11:14:32.326590 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-mc48w"] Mar 20 11:14:32 crc kubenswrapper[4860]: I0320 11:14:32.335048 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-4tdg4"] Mar 20 11:14:32 crc kubenswrapper[4860]: I0320 11:14:32.350281 4860 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-wfczk"] Mar 20 11:14:32 crc kubenswrapper[4860]: I0320 11:14:32.489203 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-4nk5c"] Mar 20 11:14:32 crc kubenswrapper[4860]: I0320 11:14:32.517573 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-b4zcf"] Mar 20 11:14:32 crc kubenswrapper[4860]: E0320 11:14:32.552181 4860 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ph84b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-d6b694c5-jd9bn_openstack-operators(b5e881e2-f657-418f-ba87-7074722307a2): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 11:14:32 crc kubenswrapper[4860]: E0320 11:14:32.553590 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-jd9bn" podUID="b5e881e2-f657-418f-ba87-7074722307a2" Mar 20 11:14:32 crc kubenswrapper[4860]: E0320 11:14:32.558919 4860 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:425fd66675becbe0ca2b2fe1a5a6694ac6e0b1cdce9a77a7a37f99785eadc74a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-q7pvw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-5b9f45d989-tjt52_openstack-operators(431ab970-7f36-4ace-860c-479faac092a0): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 11:14:32 crc kubenswrapper[4860]: E0320 11:14:32.560151 4860 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:6e7552996253fc66667eaa3eb0e11b4e97145efa2ae577155ceabf8e9913ddc1,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-99lpj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-67ccfc9778-m8948_openstack-operators(d7202366-6dc1-45ca-bb9a-74bdd0426c5f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 11:14:32 crc kubenswrapper[4860]: E0320 11:14:32.560267 4860 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dzznl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-5d488d59fb-z8fp5_openstack-operators(6c2530cf-70b4-4a89-acff-086b36773edf): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 11:14:32 crc kubenswrapper[4860]: E0320 11:14:32.560357 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-tjt52" podUID="431ab970-7f36-4ace-860c-479faac092a0" Mar 20 11:14:32 crc kubenswrapper[4860]: E0320 11:14:32.561552 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-m8948" podUID="d7202366-6dc1-45ca-bb9a-74bdd0426c5f" Mar 20 11:14:32 crc kubenswrapper[4860]: E0320 11:14:32.561632 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-z8fp5" podUID="6c2530cf-70b4-4a89-acff-086b36773edf" Mar 20 11:14:32 crc 
kubenswrapper[4860]: I0320 11:14:32.570785 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-jd9bn"] Mar 20 11:14:32 crc kubenswrapper[4860]: I0320 11:14:32.580198 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-z8fp5"] Mar 20 11:14:32 crc kubenswrapper[4860]: I0320 11:14:32.587849 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-m8948"] Mar 20 11:14:32 crc kubenswrapper[4860]: I0320 11:14:32.595608 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-tjt52"] Mar 20 11:14:32 crc kubenswrapper[4860]: I0320 11:14:32.608986 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-webhook-certs\") pod \"openstack-operator-controller-manager-6697dffbc-hpk42\" (UID: \"84431296-0ca0-425a-8da8-c3ea46b08b29\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-hpk42" Mar 20 11:14:32 crc kubenswrapper[4860]: I0320 11:14:32.609078 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-metrics-certs\") pod \"openstack-operator-controller-manager-6697dffbc-hpk42\" (UID: \"84431296-0ca0-425a-8da8-c3ea46b08b29\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-hpk42" Mar 20 11:14:32 crc kubenswrapper[4860]: E0320 11:14:32.610356 4860 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 11:14:32 crc kubenswrapper[4860]: E0320 11:14:32.610436 4860 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-webhook-certs podName:84431296-0ca0-425a-8da8-c3ea46b08b29 nodeName:}" failed. No retries permitted until 2026-03-20 11:14:34.61041172 +0000 UTC m=+1198.831772618 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-webhook-certs") pod "openstack-operator-controller-manager-6697dffbc-hpk42" (UID: "84431296-0ca0-425a-8da8-c3ea46b08b29") : secret "webhook-server-cert" not found Mar 20 11:14:32 crc kubenswrapper[4860]: E0320 11:14:32.610631 4860 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 11:14:32 crc kubenswrapper[4860]: E0320 11:14:32.610745 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-metrics-certs podName:84431296-0ca0-425a-8da8-c3ea46b08b29 nodeName:}" failed. No retries permitted until 2026-03-20 11:14:34.610720589 +0000 UTC m=+1198.832081487 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-metrics-certs") pod "openstack-operator-controller-manager-6697dffbc-hpk42" (UID: "84431296-0ca0-425a-8da8-c3ea46b08b29") : secret "metrics-server-cert" not found Mar 20 11:14:32 crc kubenswrapper[4860]: I0320 11:14:32.651878 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-ncmzn"] Mar 20 11:14:32 crc kubenswrapper[4860]: E0320 11:14:32.659029 4860 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-f9jc4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-768b96df4c-pq75b_openstack-operators(fbbe8243-9afb-4fc5-90f1-04d6f0c074ef): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 11:14:32 crc kubenswrapper[4860]: E0320 11:14:32.659150 4860 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qhllj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6c4d75f7f9-ncmzn_openstack-operators(1723efcf-97d7-4101-a15d-d4776d45d29b): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 11:14:32 crc kubenswrapper[4860]: E0320 11:14:32.660353 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-pq75b" podUID="fbbe8243-9afb-4fc5-90f1-04d6f0c074ef" Mar 20 11:14:32 crc kubenswrapper[4860]: E0320 11:14:32.660452 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-ncmzn" podUID="1723efcf-97d7-4101-a15d-d4776d45d29b" Mar 20 11:14:32 crc kubenswrapper[4860]: I0320 11:14:32.666736 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-pq75b"] Mar 20 11:14:32 crc kubenswrapper[4860]: I0320 11:14:32.996691 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/swift-operator-controller-manager-c674c5965-dvptb" event={"ID":"cce5926a-9df6-4915-a94f-02cf2f74fccc","Type":"ContainerStarted","Data":"ba3ced7813e4fd7fd87849bcff9024dafc3336c86d4c304ecb54685fadca0e60"} Mar 20 11:14:32 crc kubenswrapper[4860]: I0320 11:14:32.998865 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-pq75b" event={"ID":"fbbe8243-9afb-4fc5-90f1-04d6f0c074ef","Type":"ContainerStarted","Data":"259ac670c13cada7887f8ddd23af8e8f420390512c179c02aa9d62ec42d77c53"} Mar 20 11:14:33 crc kubenswrapper[4860]: E0320 11:14:33.001445 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-pq75b" podUID="fbbe8243-9afb-4fc5-90f1-04d6f0c074ef" Mar 20 11:14:33 crc kubenswrapper[4860]: I0320 11:14:33.004424 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-zphz9" event={"ID":"5fdbc315-f7fd-47ac-aa39-fdbe068f6f3b","Type":"ContainerStarted","Data":"57238a829c24483edc7b7f4ba4a3f4af0b1f67a21336c58eb92c49353e2913e0"} Mar 20 11:14:33 crc kubenswrapper[4860]: I0320 11:14:33.005860 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-2vsjq" event={"ID":"29801d0c-963e-4b38-ad2d-8b03d3ade0be","Type":"ContainerStarted","Data":"d1b61d4bc46f0fc8a3dcdadce15cdaacd80fb80faeea6db2b60ecbb47c06435d"} Mar 20 11:14:33 crc kubenswrapper[4860]: I0320 11:14:33.008618 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-jd9bn" 
event={"ID":"b5e881e2-f657-418f-ba87-7074722307a2","Type":"ContainerStarted","Data":"23e2e5a1441c2cc89cc776a76142fcf6619b8f62fd5add6a8a4e9bce0a9588c2"} Mar 20 11:14:33 crc kubenswrapper[4860]: E0320 11:14:33.010812 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-jd9bn" podUID="b5e881e2-f657-418f-ba87-7074722307a2" Mar 20 11:14:33 crc kubenswrapper[4860]: I0320 11:14:33.012663 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-s2kwq" event={"ID":"178fff2d-699c-4cab-8626-3e30a6bd9ed6","Type":"ContainerStarted","Data":"7d1d04346c0229f6683d4f3f87b8356f2be76cf66c4debdd377435df577f13ab"} Mar 20 11:14:33 crc kubenswrapper[4860]: I0320 11:14:33.017115 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-4nk5c" event={"ID":"c736e6d7-6806-4ef3-a0b3-f1b17ab33037","Type":"ContainerStarted","Data":"855e5b3d56507d09b7fb79e340a42bff1527e6ec7b7c44f650fa33a7e4d296d7"} Mar 20 11:14:33 crc kubenswrapper[4860]: I0320 11:14:33.033113 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-mc48w" event={"ID":"acf57205-3b95-48a3-8222-1b57b0b6c54b","Type":"ContainerStarted","Data":"e927bae90201f5a5587c43f8dd59c8f99798df5507e3f73c3d5ab213af1af82e"} Mar 20 11:14:33 crc kubenswrapper[4860]: I0320 11:14:33.035428 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-z8fp5" 
event={"ID":"6c2530cf-70b4-4a89-acff-086b36773edf","Type":"ContainerStarted","Data":"f9cb0b7f58056f0f34d7bf252bd1a03483971006e58fb26b7f71b9cbc4cd8a1e"} Mar 20 11:14:33 crc kubenswrapper[4860]: E0320 11:14:33.043604 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a\\\"\"" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-z8fp5" podUID="6c2530cf-70b4-4a89-acff-086b36773edf" Mar 20 11:14:33 crc kubenswrapper[4860]: I0320 11:14:33.051929 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-tjt52" event={"ID":"431ab970-7f36-4ace-860c-479faac092a0","Type":"ContainerStarted","Data":"1d7e414246e6b5722ed876b7bc9372effb5108268900a8295c5f131fdb837104"} Mar 20 11:14:33 crc kubenswrapper[4860]: I0320 11:14:33.053866 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-8dh72" event={"ID":"8b4d2530-4f67-45e8-9444-bea25fdad6ae","Type":"ContainerStarted","Data":"7e09276a073074ff1b26a9e545b83e81d4d87c1adb1d28bbd3bde3edf96ac0b3"} Mar 20 11:14:33 crc kubenswrapper[4860]: E0320 11:14:33.055184 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:425fd66675becbe0ca2b2fe1a5a6694ac6e0b1cdce9a77a7a37f99785eadc74a\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-tjt52" podUID="431ab970-7f36-4ace-860c-479faac092a0" Mar 20 11:14:33 crc kubenswrapper[4860]: I0320 11:14:33.056052 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-2692b" 
event={"ID":"20d35dc6-0fc2-4651-9dcd-855814132a5f","Type":"ContainerStarted","Data":"2fb6847ae3e5e9932ad364315ce61b7497472147fbe6029885f0eaaa5991042b"} Mar 20 11:14:33 crc kubenswrapper[4860]: I0320 11:14:33.057739 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-vw2d9" event={"ID":"36138670-7449-4d49-8a23-73b57d10b67f","Type":"ContainerStarted","Data":"653d999fec5003aae812deaa1315681ba24a09182a99ff5606e9a5eb982df48f"} Mar 20 11:14:33 crc kubenswrapper[4860]: I0320 11:14:33.059957 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-pzk5m" event={"ID":"0fe9b978-da91-4568-9b77-0d5930aca888","Type":"ContainerStarted","Data":"b10464282f54fb9a13c150e1ad56528f1ef54ca6ca0e893d367b6788861ad279"} Mar 20 11:14:33 crc kubenswrapper[4860]: I0320 11:14:33.066928 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-b4zcf" event={"ID":"f329ab6d-5c8c-4ed2-a830-d0a04bb31071","Type":"ContainerStarted","Data":"52701df3a58a4615ad9d527ff7cb91a8105f07c2e9cf7e70925c42479dfdfa8a"} Mar 20 11:14:33 crc kubenswrapper[4860]: I0320 11:14:33.072535 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-m8948" event={"ID":"d7202366-6dc1-45ca-bb9a-74bdd0426c5f","Type":"ContainerStarted","Data":"3a7e6c9120211a60f9c0ff50ac9a9f7a16cc3d0c7d24b0931885e87c80cfd819"} Mar 20 11:14:33 crc kubenswrapper[4860]: E0320 11:14:33.075482 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:6e7552996253fc66667eaa3eb0e11b4e97145efa2ae577155ceabf8e9913ddc1\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-m8948" 
podUID="d7202366-6dc1-45ca-bb9a-74bdd0426c5f" Mar 20 11:14:33 crc kubenswrapper[4860]: I0320 11:14:33.077371 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-wfczk" event={"ID":"c54f27c4-bd61-4bad-bf91-376fee65d219","Type":"ContainerStarted","Data":"27e0f4781c13dca181650d9f15d4a4b37f69875aa55ee0c9a9a2b76706b51bfb"} Mar 20 11:14:33 crc kubenswrapper[4860]: I0320 11:14:33.079602 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-4tdg4" event={"ID":"7f73053a-86aa-42dc-bcca-ee26a4fda2e5","Type":"ContainerStarted","Data":"37f3ed3334cc6fb0c5672170f44bb421e636f431221af8f9255733a9bade8084"} Mar 20 11:14:33 crc kubenswrapper[4860]: I0320 11:14:33.086653 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-ncmzn" event={"ID":"1723efcf-97d7-4101-a15d-d4776d45d29b","Type":"ContainerStarted","Data":"42af6d544fb4a25beae254e15fa4f1f8fffbfa86ae5c09d56b99174d27dafbb1"} Mar 20 11:14:33 crc kubenswrapper[4860]: E0320 11:14:33.087620 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:c8743a6661d118b0e5ba3eb110643358a8a3237dc75984a8f9829880b55a1622\\\"\"" pod="openstack-operators/placement-operator-controller-manager-5784578c99-4tdg4" podUID="7f73053a-86aa-42dc-bcca-ee26a4fda2e5" Mar 20 11:14:33 crc kubenswrapper[4860]: E0320 11:14:33.088440 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" 
pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-ncmzn" podUID="1723efcf-97d7-4101-a15d-d4776d45d29b" Mar 20 11:14:33 crc kubenswrapper[4860]: I0320 11:14:33.629362 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/70703379-8eb2-4f8a-95c8-302b53692a53-cert\") pod \"infra-operator-controller-manager-669fff9c7c-njzqs\" (UID: \"70703379-8eb2-4f8a-95c8-302b53692a53\") " pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-njzqs" Mar 20 11:14:33 crc kubenswrapper[4860]: E0320 11:14:33.630095 4860 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 11:14:33 crc kubenswrapper[4860]: E0320 11:14:33.630154 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70703379-8eb2-4f8a-95c8-302b53692a53-cert podName:70703379-8eb2-4f8a-95c8-302b53692a53 nodeName:}" failed. No retries permitted until 2026-03-20 11:14:37.630138365 +0000 UTC m=+1201.851499263 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/70703379-8eb2-4f8a-95c8-302b53692a53-cert") pod "infra-operator-controller-manager-669fff9c7c-njzqs" (UID: "70703379-8eb2-4f8a-95c8-302b53692a53") : secret "infra-operator-webhook-server-cert" not found Mar 20 11:14:33 crc kubenswrapper[4860]: I0320 11:14:33.952811 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ecf64e38-138d-4ef7-8b17-c09f30358f3e-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-5tqgx\" (UID: \"ecf64e38-138d-4ef7-8b17-c09f30358f3e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-5tqgx" Mar 20 11:14:33 crc kubenswrapper[4860]: E0320 11:14:33.953115 4860 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 11:14:33 crc kubenswrapper[4860]: E0320 11:14:33.953181 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ecf64e38-138d-4ef7-8b17-c09f30358f3e-cert podName:ecf64e38-138d-4ef7-8b17-c09f30358f3e nodeName:}" failed. No retries permitted until 2026-03-20 11:14:37.953163567 +0000 UTC m=+1202.174524465 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ecf64e38-138d-4ef7-8b17-c09f30358f3e-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-5tqgx" (UID: "ecf64e38-138d-4ef7-8b17-c09f30358f3e") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 11:14:34 crc kubenswrapper[4860]: E0320 11:14:34.104801 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-jd9bn" podUID="b5e881e2-f657-418f-ba87-7074722307a2" Mar 20 11:14:34 crc kubenswrapper[4860]: E0320 11:14:34.104798 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-ncmzn" podUID="1723efcf-97d7-4101-a15d-d4776d45d29b" Mar 20 11:14:34 crc kubenswrapper[4860]: E0320 11:14:34.105431 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a\\\"\"" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-z8fp5" podUID="6c2530cf-70b4-4a89-acff-086b36773edf" Mar 20 11:14:34 crc kubenswrapper[4860]: E0320 11:14:34.105505 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/placement-operator@sha256:c8743a6661d118b0e5ba3eb110643358a8a3237dc75984a8f9829880b55a1622\\\"\"" pod="openstack-operators/placement-operator-controller-manager-5784578c99-4tdg4" podUID="7f73053a-86aa-42dc-bcca-ee26a4fda2e5" Mar 20 11:14:34 crc kubenswrapper[4860]: E0320 11:14:34.105550 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-pq75b" podUID="fbbe8243-9afb-4fc5-90f1-04d6f0c074ef" Mar 20 11:14:34 crc kubenswrapper[4860]: E0320 11:14:34.108428 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:6e7552996253fc66667eaa3eb0e11b4e97145efa2ae577155ceabf8e9913ddc1\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-m8948" podUID="d7202366-6dc1-45ca-bb9a-74bdd0426c5f" Mar 20 11:14:34 crc kubenswrapper[4860]: E0320 11:14:34.108625 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:425fd66675becbe0ca2b2fe1a5a6694ac6e0b1cdce9a77a7a37f99785eadc74a\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-tjt52" podUID="431ab970-7f36-4ace-860c-479faac092a0" Mar 20 11:14:34 crc kubenswrapper[4860]: I0320 11:14:34.668339 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-webhook-certs\") pod \"openstack-operator-controller-manager-6697dffbc-hpk42\" 
(UID: \"84431296-0ca0-425a-8da8-c3ea46b08b29\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-hpk42" Mar 20 11:14:34 crc kubenswrapper[4860]: I0320 11:14:34.668421 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-metrics-certs\") pod \"openstack-operator-controller-manager-6697dffbc-hpk42\" (UID: \"84431296-0ca0-425a-8da8-c3ea46b08b29\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-hpk42" Mar 20 11:14:34 crc kubenswrapper[4860]: E0320 11:14:34.668626 4860 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 11:14:34 crc kubenswrapper[4860]: E0320 11:14:34.668699 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-metrics-certs podName:84431296-0ca0-425a-8da8-c3ea46b08b29 nodeName:}" failed. No retries permitted until 2026-03-20 11:14:38.668674449 +0000 UTC m=+1202.890035347 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-metrics-certs") pod "openstack-operator-controller-manager-6697dffbc-hpk42" (UID: "84431296-0ca0-425a-8da8-c3ea46b08b29") : secret "metrics-server-cert" not found Mar 20 11:14:34 crc kubenswrapper[4860]: E0320 11:14:34.669179 4860 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 11:14:34 crc kubenswrapper[4860]: E0320 11:14:34.669213 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-webhook-certs podName:84431296-0ca0-425a-8da8-c3ea46b08b29 nodeName:}" failed. No retries permitted until 2026-03-20 11:14:38.669202664 +0000 UTC m=+1202.890563562 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-webhook-certs") pod "openstack-operator-controller-manager-6697dffbc-hpk42" (UID: "84431296-0ca0-425a-8da8-c3ea46b08b29") : secret "webhook-server-cert" not found Mar 20 11:14:37 crc kubenswrapper[4860]: I0320 11:14:37.656523 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/70703379-8eb2-4f8a-95c8-302b53692a53-cert\") pod \"infra-operator-controller-manager-669fff9c7c-njzqs\" (UID: \"70703379-8eb2-4f8a-95c8-302b53692a53\") " pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-njzqs" Mar 20 11:14:37 crc kubenswrapper[4860]: E0320 11:14:37.657020 4860 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 11:14:37 crc kubenswrapper[4860]: E0320 11:14:37.657495 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70703379-8eb2-4f8a-95c8-302b53692a53-cert podName:70703379-8eb2-4f8a-95c8-302b53692a53 nodeName:}" failed. No retries permitted until 2026-03-20 11:14:45.657339845 +0000 UTC m=+1209.878700743 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/70703379-8eb2-4f8a-95c8-302b53692a53-cert") pod "infra-operator-controller-manager-669fff9c7c-njzqs" (UID: "70703379-8eb2-4f8a-95c8-302b53692a53") : secret "infra-operator-webhook-server-cert" not found Mar 20 11:14:37 crc kubenswrapper[4860]: I0320 11:14:37.961419 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ecf64e38-138d-4ef7-8b17-c09f30358f3e-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-5tqgx\" (UID: \"ecf64e38-138d-4ef7-8b17-c09f30358f3e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-5tqgx" Mar 20 11:14:37 crc kubenswrapper[4860]: E0320 11:14:37.961680 4860 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 11:14:37 crc kubenswrapper[4860]: E0320 11:14:37.961803 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ecf64e38-138d-4ef7-8b17-c09f30358f3e-cert podName:ecf64e38-138d-4ef7-8b17-c09f30358f3e nodeName:}" failed. No retries permitted until 2026-03-20 11:14:45.961777504 +0000 UTC m=+1210.183138402 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ecf64e38-138d-4ef7-8b17-c09f30358f3e-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-5tqgx" (UID: "ecf64e38-138d-4ef7-8b17-c09f30358f3e") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 11:14:38 crc kubenswrapper[4860]: I0320 11:14:38.672281 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-webhook-certs\") pod \"openstack-operator-controller-manager-6697dffbc-hpk42\" (UID: \"84431296-0ca0-425a-8da8-c3ea46b08b29\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-hpk42" Mar 20 11:14:38 crc kubenswrapper[4860]: I0320 11:14:38.672369 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-metrics-certs\") pod \"openstack-operator-controller-manager-6697dffbc-hpk42\" (UID: \"84431296-0ca0-425a-8da8-c3ea46b08b29\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-hpk42" Mar 20 11:14:38 crc kubenswrapper[4860]: E0320 11:14:38.672529 4860 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 11:14:38 crc kubenswrapper[4860]: E0320 11:14:38.672574 4860 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 11:14:38 crc kubenswrapper[4860]: E0320 11:14:38.672639 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-webhook-certs podName:84431296-0ca0-425a-8da8-c3ea46b08b29 nodeName:}" failed. No retries permitted until 2026-03-20 11:14:46.67261315 +0000 UTC m=+1210.893974048 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-webhook-certs") pod "openstack-operator-controller-manager-6697dffbc-hpk42" (UID: "84431296-0ca0-425a-8da8-c3ea46b08b29") : secret "webhook-server-cert" not found Mar 20 11:14:38 crc kubenswrapper[4860]: E0320 11:14:38.672664 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-metrics-certs podName:84431296-0ca0-425a-8da8-c3ea46b08b29 nodeName:}" failed. No retries permitted until 2026-03-20 11:14:46.672653281 +0000 UTC m=+1210.894014269 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-metrics-certs") pod "openstack-operator-controller-manager-6697dffbc-hpk42" (UID: "84431296-0ca0-425a-8da8-c3ea46b08b29") : secret "metrics-server-cert" not found Mar 20 11:14:41 crc kubenswrapper[4860]: I0320 11:14:41.524200 4860 scope.go:117] "RemoveContainer" containerID="478ee16ae7828909784a1f93be49bfc8c3fee1419599f3474cd82711371e05b3" Mar 20 11:14:44 crc kubenswrapper[4860]: E0320 11:14:44.745178 4860 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/glance-operator@sha256:76a1cde9f29fb39ed715b06be16adb803b9a2e24d68acb369911c0a88e33bc7d" Mar 20 11:14:44 crc kubenswrapper[4860]: E0320 11:14:44.745408 4860 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:76a1cde9f29fb39ed715b06be16adb803b9a2e24d68acb369911c0a88e33bc7d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vgkkw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-79df6bcc97-zphz9_openstack-operators(5fdbc315-f7fd-47ac-aa39-fdbe068f6f3b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 11:14:44 crc kubenswrapper[4860]: E0320 11:14:44.746601 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-zphz9" podUID="5fdbc315-f7fd-47ac-aa39-fdbe068f6f3b" Mar 20 11:14:45 crc kubenswrapper[4860]: E0320 11:14:45.209785 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/glance-operator@sha256:76a1cde9f29fb39ed715b06be16adb803b9a2e24d68acb369911c0a88e33bc7d\\\"\"" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-zphz9" podUID="5fdbc315-f7fd-47ac-aa39-fdbe068f6f3b" Mar 20 11:14:45 crc kubenswrapper[4860]: I0320 11:14:45.744768 4860 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/70703379-8eb2-4f8a-95c8-302b53692a53-cert\") pod \"infra-operator-controller-manager-669fff9c7c-njzqs\" (UID: \"70703379-8eb2-4f8a-95c8-302b53692a53\") " pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-njzqs" Mar 20 11:14:45 crc kubenswrapper[4860]: I0320 11:14:45.763651 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/70703379-8eb2-4f8a-95c8-302b53692a53-cert\") pod \"infra-operator-controller-manager-669fff9c7c-njzqs\" (UID: \"70703379-8eb2-4f8a-95c8-302b53692a53\") " pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-njzqs" Mar 20 11:14:45 crc kubenswrapper[4860]: I0320 11:14:45.881882 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-m7gfz" Mar 20 11:14:45 crc kubenswrapper[4860]: I0320 11:14:45.888907 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-njzqs" Mar 20 11:14:46 crc kubenswrapper[4860]: I0320 11:14:46.048113 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ecf64e38-138d-4ef7-8b17-c09f30358f3e-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-5tqgx\" (UID: \"ecf64e38-138d-4ef7-8b17-c09f30358f3e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-5tqgx" Mar 20 11:14:46 crc kubenswrapper[4860]: E0320 11:14:46.048411 4860 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 11:14:46 crc kubenswrapper[4860]: E0320 11:14:46.048524 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ecf64e38-138d-4ef7-8b17-c09f30358f3e-cert podName:ecf64e38-138d-4ef7-8b17-c09f30358f3e nodeName:}" failed. No retries permitted until 2026-03-20 11:15:02.048501618 +0000 UTC m=+1226.269862516 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ecf64e38-138d-4ef7-8b17-c09f30358f3e-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-5tqgx" (UID: "ecf64e38-138d-4ef7-8b17-c09f30358f3e") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 11:14:46 crc kubenswrapper[4860]: I0320 11:14:46.759640 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-metrics-certs\") pod \"openstack-operator-controller-manager-6697dffbc-hpk42\" (UID: \"84431296-0ca0-425a-8da8-c3ea46b08b29\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-hpk42" Mar 20 11:14:46 crc kubenswrapper[4860]: I0320 11:14:46.759864 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-webhook-certs\") pod \"openstack-operator-controller-manager-6697dffbc-hpk42\" (UID: \"84431296-0ca0-425a-8da8-c3ea46b08b29\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-hpk42" Mar 20 11:14:46 crc kubenswrapper[4860]: E0320 11:14:46.759928 4860 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 11:14:46 crc kubenswrapper[4860]: E0320 11:14:46.760036 4860 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 11:14:46 crc kubenswrapper[4860]: E0320 11:14:46.760054 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-metrics-certs podName:84431296-0ca0-425a-8da8-c3ea46b08b29 nodeName:}" failed. No retries permitted until 2026-03-20 11:15:02.760022392 +0000 UTC m=+1226.981383300 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-metrics-certs") pod "openstack-operator-controller-manager-6697dffbc-hpk42" (UID: "84431296-0ca0-425a-8da8-c3ea46b08b29") : secret "metrics-server-cert" not found Mar 20 11:14:46 crc kubenswrapper[4860]: E0320 11:14:46.760181 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-webhook-certs podName:84431296-0ca0-425a-8da8-c3ea46b08b29 nodeName:}" failed. No retries permitted until 2026-03-20 11:15:02.760152375 +0000 UTC m=+1226.981513423 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-webhook-certs") pod "openstack-operator-controller-manager-6697dffbc-hpk42" (UID: "84431296-0ca0-425a-8da8-c3ea46b08b29") : secret "webhook-server-cert" not found Mar 20 11:14:46 crc kubenswrapper[4860]: E0320 11:14:46.926513 4860 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/horizon-operator@sha256:703ad3a2b749bce100f1e2a445312b65dc3b8b45e8c8ba59f311d3f8f3368113" Mar 20 11:14:46 crc kubenswrapper[4860]: E0320 11:14:46.926734 4860 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:703ad3a2b749bce100f1e2a445312b65dc3b8b45e8c8ba59f311d3f8f3368113,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pmzc5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-8464cc45fb-wfczk_openstack-operators(c54f27c4-bd61-4bad-bf91-376fee65d219): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 11:14:46 crc kubenswrapper[4860]: E0320 11:14:46.927927 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-wfczk" podUID="c54f27c4-bd61-4bad-bf91-376fee65d219" Mar 20 11:14:47 crc kubenswrapper[4860]: E0320 11:14:47.226446 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:703ad3a2b749bce100f1e2a445312b65dc3b8b45e8c8ba59f311d3f8f3368113\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-wfczk" podUID="c54f27c4-bd61-4bad-bf91-376fee65d219" Mar 20 11:14:47 crc kubenswrapper[4860]: E0320 11:14:47.728620 4860 log.go:32] "PullImage from image service 
failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/designate-operator@sha256:12841b27173f5f1beeb83112e057c8753f4cf411f583fba4f0610fac0f60b7ad" Mar 20 11:14:47 crc kubenswrapper[4860]: E0320 11:14:47.728834 4860 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/designate-operator@sha256:12841b27173f5f1beeb83112e057c8753f4cf411f583fba4f0610fac0f60b7ad,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nkfnh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-588d4d986b-2692b_openstack-operators(20d35dc6-0fc2-4651-9dcd-855814132a5f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 11:14:47 crc kubenswrapper[4860]: E0320 11:14:47.731875 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-2692b" podUID="20d35dc6-0fc2-4651-9dcd-855814132a5f" Mar 20 11:14:48 crc kubenswrapper[4860]: I0320 11:14:48.247111 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-mc48w" event={"ID":"acf57205-3b95-48a3-8222-1b57b0b6c54b","Type":"ContainerStarted","Data":"c4b708f6c8bd7e1d8e843e24fefa9075c80eab27f8abcba486675fbbef953e17"} Mar 20 11:14:48 crc kubenswrapper[4860]: I0320 11:14:48.247548 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-mc48w" Mar 20 11:14:48 crc 
kubenswrapper[4860]: I0320 11:14:48.250448 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-55f864c847-pzk5m" Mar 20 11:14:48 crc kubenswrapper[4860]: I0320 11:14:48.260767 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-b4zcf" event={"ID":"f329ab6d-5c8c-4ed2-a830-d0a04bb31071","Type":"ContainerStarted","Data":"8774d2ddb9394da732436a583f367fb0559b05e79ed51417ba4b8379d19eccce"} Mar 20 11:14:48 crc kubenswrapper[4860]: I0320 11:14:48.260942 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-b4zcf" Mar 20 11:14:48 crc kubenswrapper[4860]: I0320 11:14:48.263931 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-8dh72" event={"ID":"8b4d2530-4f67-45e8-9444-bea25fdad6ae","Type":"ContainerStarted","Data":"6053b33dfa6413e5e99e79e183c60e20067618b574ae56cd39f2373a9f9e4256"} Mar 20 11:14:48 crc kubenswrapper[4860]: I0320 11:14:48.264500 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-8dh72" Mar 20 11:14:48 crc kubenswrapper[4860]: I0320 11:14:48.283745 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-c674c5965-dvptb" Mar 20 11:14:48 crc kubenswrapper[4860]: I0320 11:14:48.289689 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-mc48w" podStartSLOduration=3.771780658 podStartE2EDuration="19.289667936s" podCreationTimestamp="2026-03-20 11:14:29 +0000 UTC" firstStartedPulling="2026-03-20 11:14:32.306944029 +0000 UTC m=+1196.528304927" lastFinishedPulling="2026-03-20 11:14:47.824831307 +0000 UTC 
m=+1212.046192205" observedRunningTime="2026-03-20 11:14:48.280887688 +0000 UTC m=+1212.502248586" watchObservedRunningTime="2026-03-20 11:14:48.289667936 +0000 UTC m=+1212.511028834" Mar 20 11:14:48 crc kubenswrapper[4860]: I0320 11:14:48.299345 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-669fff9c7c-njzqs"] Mar 20 11:14:48 crc kubenswrapper[4860]: I0320 11:14:48.308840 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-b4zcf" podStartSLOduration=3.002975794 podStartE2EDuration="18.308816674s" podCreationTimestamp="2026-03-20 11:14:30 +0000 UTC" firstStartedPulling="2026-03-20 11:14:32.527130247 +0000 UTC m=+1196.748491145" lastFinishedPulling="2026-03-20 11:14:47.832971127 +0000 UTC m=+1212.054332025" observedRunningTime="2026-03-20 11:14:48.303773358 +0000 UTC m=+1212.525134266" watchObservedRunningTime="2026-03-20 11:14:48.308816674 +0000 UTC m=+1212.530177572" Mar 20 11:14:48 crc kubenswrapper[4860]: I0320 11:14:48.319526 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-s2kwq" event={"ID":"178fff2d-699c-4cab-8626-3e30a6bd9ed6","Type":"ContainerStarted","Data":"cffbad57ee8dff1bab6dc6744d40cffde1e9ed1ee59d24cab24773ff0852cacd"} Mar 20 11:14:48 crc kubenswrapper[4860]: I0320 11:14:48.319694 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-s2kwq" Mar 20 11:14:48 crc kubenswrapper[4860]: E0320 11:14:48.322321 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/designate-operator@sha256:12841b27173f5f1beeb83112e057c8753f4cf411f583fba4f0610fac0f60b7ad\\\"\"" 
pod="openstack-operators/designate-operator-controller-manager-588d4d986b-2692b" podUID="20d35dc6-0fc2-4651-9dcd-855814132a5f" Mar 20 11:14:48 crc kubenswrapper[4860]: I0320 11:14:48.343277 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-8dh72" podStartSLOduration=3.728024254 podStartE2EDuration="19.343247836s" podCreationTimestamp="2026-03-20 11:14:29 +0000 UTC" firstStartedPulling="2026-03-20 11:14:32.186300624 +0000 UTC m=+1196.407661522" lastFinishedPulling="2026-03-20 11:14:47.801524206 +0000 UTC m=+1212.022885104" observedRunningTime="2026-03-20 11:14:48.332015372 +0000 UTC m=+1212.553376260" watchObservedRunningTime="2026-03-20 11:14:48.343247836 +0000 UTC m=+1212.564608734" Mar 20 11:14:48 crc kubenswrapper[4860]: I0320 11:14:48.363943 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-55f864c847-pzk5m" podStartSLOduration=3.857474597 podStartE2EDuration="19.363917755s" podCreationTimestamp="2026-03-20 11:14:29 +0000 UTC" firstStartedPulling="2026-03-20 11:14:32.305534471 +0000 UTC m=+1196.526895369" lastFinishedPulling="2026-03-20 11:14:47.811977629 +0000 UTC m=+1212.033338527" observedRunningTime="2026-03-20 11:14:48.362632561 +0000 UTC m=+1212.583993459" watchObservedRunningTime="2026-03-20 11:14:48.363917755 +0000 UTC m=+1212.585278653" Mar 20 11:14:48 crc kubenswrapper[4860]: I0320 11:14:48.442144 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-s2kwq" podStartSLOduration=3.800345561 podStartE2EDuration="19.442121492s" podCreationTimestamp="2026-03-20 11:14:29 +0000 UTC" firstStartedPulling="2026-03-20 11:14:32.15437111 +0000 UTC m=+1196.375732008" lastFinishedPulling="2026-03-20 11:14:47.796147041 +0000 UTC m=+1212.017507939" observedRunningTime="2026-03-20 11:14:48.439160572 +0000 
UTC m=+1212.660521470" watchObservedRunningTime="2026-03-20 11:14:48.442121492 +0000 UTC m=+1212.663482390" Mar 20 11:14:48 crc kubenswrapper[4860]: I0320 11:14:48.493998 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-c674c5965-dvptb" podStartSLOduration=3.0036312020000002 podStartE2EDuration="18.493962244s" podCreationTimestamp="2026-03-20 11:14:30 +0000 UTC" firstStartedPulling="2026-03-20 11:14:32.305765027 +0000 UTC m=+1196.527125915" lastFinishedPulling="2026-03-20 11:14:47.796096069 +0000 UTC m=+1212.017456957" observedRunningTime="2026-03-20 11:14:48.476211044 +0000 UTC m=+1212.697571942" watchObservedRunningTime="2026-03-20 11:14:48.493962244 +0000 UTC m=+1212.715323142" Mar 20 11:14:49 crc kubenswrapper[4860]: I0320 11:14:49.332079 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-2vsjq" event={"ID":"29801d0c-963e-4b38-ad2d-8b03d3ade0be","Type":"ContainerStarted","Data":"6690b66690c5cf1349d47f82f23e3519efd44d7ad7ae0c4d33aab1d25cd9beef"} Mar 20 11:14:49 crc kubenswrapper[4860]: I0320 11:14:49.332489 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-767865f676-2vsjq" Mar 20 11:14:49 crc kubenswrapper[4860]: I0320 11:14:49.337152 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-njzqs" event={"ID":"70703379-8eb2-4f8a-95c8-302b53692a53","Type":"ContainerStarted","Data":"3d3f218cca6614ef4443974bf3189504e2f4dad8910e7ade3a11f253b6812c7a"} Mar 20 11:14:49 crc kubenswrapper[4860]: I0320 11:14:49.347681 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-dvptb" 
event={"ID":"cce5926a-9df6-4915-a94f-02cf2f74fccc","Type":"ContainerStarted","Data":"f1a8ff33fc67fd46243a472e2c23e9c942bf5c189453732627de51362450ceac"} Mar 20 11:14:49 crc kubenswrapper[4860]: I0320 11:14:49.436239 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-vw2d9" Mar 20 11:14:49 crc kubenswrapper[4860]: I0320 11:14:49.436286 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-vw2d9" event={"ID":"36138670-7449-4d49-8a23-73b57d10b67f","Type":"ContainerStarted","Data":"b8796b669a8fb9a766536a6eb3d810510e2e840db450b7dbb1246df0ffe345db"} Mar 20 11:14:49 crc kubenswrapper[4860]: I0320 11:14:49.436317 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-884679f54-4nk5c" Mar 20 11:14:49 crc kubenswrapper[4860]: I0320 11:14:49.436336 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-4nk5c" event={"ID":"c736e6d7-6806-4ef3-a0b3-f1b17ab33037","Type":"ContainerStarted","Data":"33069470f44998b1b24652c7acc1a1f10de7a07c9c01a765ab5fba90c657622d"} Mar 20 11:14:49 crc kubenswrapper[4860]: I0320 11:14:49.436353 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-pzk5m" event={"ID":"0fe9b978-da91-4568-9b77-0d5930aca888","Type":"ContainerStarted","Data":"cb97551383587fa53efc65e44c647e6c20c4eefcc6915e80b778c246bb71b8bb"} Mar 20 11:14:49 crc kubenswrapper[4860]: I0320 11:14:49.440094 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-767865f676-2vsjq" podStartSLOduration=4.721136719 podStartE2EDuration="20.440075508s" podCreationTimestamp="2026-03-20 11:14:29 +0000 UTC" firstStartedPulling="2026-03-20 11:14:32.115675773 +0000 
UTC m=+1196.337036671" lastFinishedPulling="2026-03-20 11:14:47.834614562 +0000 UTC m=+1212.055975460" observedRunningTime="2026-03-20 11:14:49.366669951 +0000 UTC m=+1213.588030859" watchObservedRunningTime="2026-03-20 11:14:49.440075508 +0000 UTC m=+1213.661436406" Mar 20 11:14:49 crc kubenswrapper[4860]: I0320 11:14:49.449954 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-vw2d9" podStartSLOduration=4.92925423 podStartE2EDuration="20.449933544s" podCreationTimestamp="2026-03-20 11:14:29 +0000 UTC" firstStartedPulling="2026-03-20 11:14:32.279205238 +0000 UTC m=+1196.500566136" lastFinishedPulling="2026-03-20 11:14:47.799884552 +0000 UTC m=+1212.021245450" observedRunningTime="2026-03-20 11:14:49.439360618 +0000 UTC m=+1213.660721526" watchObservedRunningTime="2026-03-20 11:14:49.449933544 +0000 UTC m=+1213.671294442" Mar 20 11:14:49 crc kubenswrapper[4860]: I0320 11:14:49.471289 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-884679f54-4nk5c" podStartSLOduration=5.159443949 podStartE2EDuration="20.471256951s" podCreationTimestamp="2026-03-20 11:14:29 +0000 UTC" firstStartedPulling="2026-03-20 11:14:32.522633185 +0000 UTC m=+1196.743994083" lastFinishedPulling="2026-03-20 11:14:47.834446187 +0000 UTC m=+1212.055807085" observedRunningTime="2026-03-20 11:14:49.464837848 +0000 UTC m=+1213.686198766" watchObservedRunningTime="2026-03-20 11:14:49.471256951 +0000 UTC m=+1213.692617849" Mar 20 11:14:52 crc kubenswrapper[4860]: I0320 11:14:52.344965 4860 patch_prober.go:28] interesting pod/machine-config-daemon-kvdqp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:14:52 crc kubenswrapper[4860]: I0320 11:14:52.345568 
4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:14:52 crc kubenswrapper[4860]: I0320 11:14:52.345633 4860 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" Mar 20 11:14:52 crc kubenswrapper[4860]: I0320 11:14:52.346473 4860 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"88d9c34d042546706542bef7e7c591b52fa1f874953861e5601a8c6aea607a26"} pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 11:14:52 crc kubenswrapper[4860]: I0320 11:14:52.346540 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" containerID="cri-o://88d9c34d042546706542bef7e7c591b52fa1f874953861e5601a8c6aea607a26" gracePeriod=600 Mar 20 11:14:53 crc kubenswrapper[4860]: I0320 11:14:53.480096 4860 generic.go:334] "Generic (PLEG): container finished" podID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerID="88d9c34d042546706542bef7e7c591b52fa1f874953861e5601a8c6aea607a26" exitCode=0 Mar 20 11:14:53 crc kubenswrapper[4860]: I0320 11:14:53.480142 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" event={"ID":"6a9df230-75a1-4b64-8d00-c179e9c19080","Type":"ContainerDied","Data":"88d9c34d042546706542bef7e7c591b52fa1f874953861e5601a8c6aea607a26"} Mar 20 11:14:53 crc kubenswrapper[4860]: 
I0320 11:14:53.480724 4860 scope.go:117] "RemoveContainer" containerID="8a778954d1374877671fc9bca86b7581cc1911487a943aba7bf61952dec5e818" Mar 20 11:14:59 crc kubenswrapper[4860]: I0320 11:14:59.747502 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-8dh72" Mar 20 11:14:59 crc kubenswrapper[4860]: I0320 11:14:59.759293 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-s2kwq" Mar 20 11:14:59 crc kubenswrapper[4860]: I0320 11:14:59.857659 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-vw2d9" Mar 20 11:15:00 crc kubenswrapper[4860]: I0320 11:15:00.154786 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566755-bj9gm"] Mar 20 11:15:00 crc kubenswrapper[4860]: I0320 11:15:00.156360 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-bj9gm" Mar 20 11:15:00 crc kubenswrapper[4860]: I0320 11:15:00.161258 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 11:15:00 crc kubenswrapper[4860]: I0320 11:15:00.161326 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 11:15:00 crc kubenswrapper[4860]: I0320 11:15:00.169366 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566755-bj9gm"] Mar 20 11:15:00 crc kubenswrapper[4860]: I0320 11:15:00.263191 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-55f864c847-pzk5m" Mar 20 11:15:00 crc kubenswrapper[4860]: I0320 11:15:00.289632 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/20271235-6d5c-451f-a889-725d0b95503e-config-volume\") pod \"collect-profiles-29566755-bj9gm\" (UID: \"20271235-6d5c-451f-a889-725d0b95503e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-bj9gm" Mar 20 11:15:00 crc kubenswrapper[4860]: I0320 11:15:00.289793 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnfm2\" (UniqueName: \"kubernetes.io/projected/20271235-6d5c-451f-a889-725d0b95503e-kube-api-access-cnfm2\") pod \"collect-profiles-29566755-bj9gm\" (UID: \"20271235-6d5c-451f-a889-725d0b95503e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-bj9gm" Mar 20 11:15:00 crc kubenswrapper[4860]: I0320 11:15:00.289840 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/20271235-6d5c-451f-a889-725d0b95503e-secret-volume\") pod \"collect-profiles-29566755-bj9gm\" (UID: \"20271235-6d5c-451f-a889-725d0b95503e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-bj9gm" Mar 20 11:15:00 crc kubenswrapper[4860]: I0320 11:15:00.363132 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-767865f676-2vsjq" Mar 20 11:15:00 crc kubenswrapper[4860]: I0320 11:15:00.369280 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-mc48w" Mar 20 11:15:00 crc kubenswrapper[4860]: I0320 11:15:00.392050 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/20271235-6d5c-451f-a889-725d0b95503e-secret-volume\") pod \"collect-profiles-29566755-bj9gm\" (UID: \"20271235-6d5c-451f-a889-725d0b95503e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-bj9gm" Mar 20 11:15:00 crc kubenswrapper[4860]: I0320 11:15:00.392190 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/20271235-6d5c-451f-a889-725d0b95503e-config-volume\") pod \"collect-profiles-29566755-bj9gm\" (UID: \"20271235-6d5c-451f-a889-725d0b95503e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-bj9gm" Mar 20 11:15:00 crc kubenswrapper[4860]: I0320 11:15:00.392346 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnfm2\" (UniqueName: \"kubernetes.io/projected/20271235-6d5c-451f-a889-725d0b95503e-kube-api-access-cnfm2\") pod \"collect-profiles-29566755-bj9gm\" (UID: \"20271235-6d5c-451f-a889-725d0b95503e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-bj9gm" Mar 20 11:15:00 crc kubenswrapper[4860]: I0320 
11:15:00.396786 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/20271235-6d5c-451f-a889-725d0b95503e-config-volume\") pod \"collect-profiles-29566755-bj9gm\" (UID: \"20271235-6d5c-451f-a889-725d0b95503e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-bj9gm" Mar 20 11:15:00 crc kubenswrapper[4860]: I0320 11:15:00.405433 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/20271235-6d5c-451f-a889-725d0b95503e-secret-volume\") pod \"collect-profiles-29566755-bj9gm\" (UID: \"20271235-6d5c-451f-a889-725d0b95503e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-bj9gm" Mar 20 11:15:00 crc kubenswrapper[4860]: I0320 11:15:00.415060 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnfm2\" (UniqueName: \"kubernetes.io/projected/20271235-6d5c-451f-a889-725d0b95503e-kube-api-access-cnfm2\") pod \"collect-profiles-29566755-bj9gm\" (UID: \"20271235-6d5c-451f-a889-725d0b95503e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-bj9gm" Mar 20 11:15:00 crc kubenswrapper[4860]: I0320 11:15:00.489816 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-bj9gm" Mar 20 11:15:00 crc kubenswrapper[4860]: I0320 11:15:00.502097 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-884679f54-4nk5c" Mar 20 11:15:00 crc kubenswrapper[4860]: I0320 11:15:00.695719 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-c674c5965-dvptb" Mar 20 11:15:01 crc kubenswrapper[4860]: I0320 11:15:01.112131 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-b4zcf" Mar 20 11:15:02 crc kubenswrapper[4860]: I0320 11:15:02.120092 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ecf64e38-138d-4ef7-8b17-c09f30358f3e-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-5tqgx\" (UID: \"ecf64e38-138d-4ef7-8b17-c09f30358f3e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-5tqgx" Mar 20 11:15:02 crc kubenswrapper[4860]: I0320 11:15:02.136345 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ecf64e38-138d-4ef7-8b17-c09f30358f3e-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-5tqgx\" (UID: \"ecf64e38-138d-4ef7-8b17-c09f30358f3e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-5tqgx" Mar 20 11:15:02 crc kubenswrapper[4860]: E0320 11:15:02.166882 4860 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:6e7552996253fc66667eaa3eb0e11b4e97145efa2ae577155ceabf8e9913ddc1" Mar 20 11:15:02 crc kubenswrapper[4860]: E0320 
11:15:02.167142 4860 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:6e7552996253fc66667eaa3eb0e11b4e97145efa2ae577155ceabf8e9913ddc1,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-99lpj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-67ccfc9778-m8948_openstack-operators(d7202366-6dc1-45ca-bb9a-74bdd0426c5f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 11:15:02 crc kubenswrapper[4860]: E0320 11:15:02.168422 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-m8948" podUID="d7202366-6dc1-45ca-bb9a-74bdd0426c5f" Mar 20 11:15:02 crc kubenswrapper[4860]: I0320 11:15:02.332878 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-k6th9" Mar 20 11:15:02 crc kubenswrapper[4860]: I0320 11:15:02.340993 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-5tqgx" Mar 20 11:15:02 crc kubenswrapper[4860]: I0320 11:15:02.831508 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-webhook-certs\") pod \"openstack-operator-controller-manager-6697dffbc-hpk42\" (UID: \"84431296-0ca0-425a-8da8-c3ea46b08b29\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-hpk42" Mar 20 11:15:02 crc kubenswrapper[4860]: I0320 11:15:02.832412 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-metrics-certs\") pod \"openstack-operator-controller-manager-6697dffbc-hpk42\" (UID: \"84431296-0ca0-425a-8da8-c3ea46b08b29\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-hpk42" Mar 20 11:15:02 crc kubenswrapper[4860]: I0320 11:15:02.839974 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-webhook-certs\") pod \"openstack-operator-controller-manager-6697dffbc-hpk42\" (UID: \"84431296-0ca0-425a-8da8-c3ea46b08b29\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-hpk42" Mar 20 11:15:02 crc kubenswrapper[4860]: I0320 11:15:02.844988 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-metrics-certs\") pod \"openstack-operator-controller-manager-6697dffbc-hpk42\" (UID: \"84431296-0ca0-425a-8da8-c3ea46b08b29\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-hpk42" Mar 20 11:15:02 crc kubenswrapper[4860]: I0320 11:15:02.856955 4860 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-rz2ll" Mar 20 11:15:02 crc kubenswrapper[4860]: I0320 11:15:02.864535 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-hpk42" Mar 20 11:15:04 crc kubenswrapper[4860]: I0320 11:15:04.395357 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566755-bj9gm"] Mar 20 11:15:04 crc kubenswrapper[4860]: I0320 11:15:04.486183 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-5tqgx"] Mar 20 11:15:04 crc kubenswrapper[4860]: I0320 11:15:04.524587 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6697dffbc-hpk42"] Mar 20 11:15:04 crc kubenswrapper[4860]: I0320 11:15:04.616650 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-njzqs" event={"ID":"70703379-8eb2-4f8a-95c8-302b53692a53","Type":"ContainerStarted","Data":"c0d6274ac4037eb1f88e930a94b737d0a2f092be4b1c7367386f1ceb6cc01ced"} Mar 20 11:15:04 crc kubenswrapper[4860]: I0320 11:15:04.616824 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-njzqs" Mar 20 11:15:04 crc kubenswrapper[4860]: I0320 11:15:04.618063 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-ncmzn" event={"ID":"1723efcf-97d7-4101-a15d-d4776d45d29b","Type":"ContainerStarted","Data":"44ec7ea3c410da47fcddc13a213fd5b93220b87f7aca26a891c5672788b46187"} Mar 20 11:15:04 crc kubenswrapper[4860]: I0320 11:15:04.619011 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-ncmzn" Mar 20 11:15:04 crc kubenswrapper[4860]: W0320 11:15:04.619777 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84431296_0ca0_425a_8da8_c3ea46b08b29.slice/crio-97cfe89104b2ede051095b525c6b537319ce860f5e50deffd73275cff40a8b5d WatchSource:0}: Error finding container 97cfe89104b2ede051095b525c6b537319ce860f5e50deffd73275cff40a8b5d: Status 404 returned error can't find the container with id 97cfe89104b2ede051095b525c6b537319ce860f5e50deffd73275cff40a8b5d Mar 20 11:15:04 crc kubenswrapper[4860]: I0320 11:15:04.623979 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-jd9bn" event={"ID":"b5e881e2-f657-418f-ba87-7074722307a2","Type":"ContainerStarted","Data":"561d50dea49b7501b2c0729c0aad9aa1605b8f1c2182c02db2a03fad6dc3dbd2"} Mar 20 11:15:04 crc kubenswrapper[4860]: I0320 11:15:04.624976 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-jd9bn" Mar 20 11:15:04 crc kubenswrapper[4860]: I0320 11:15:04.634855 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-zphz9" event={"ID":"5fdbc315-f7fd-47ac-aa39-fdbe068f6f3b","Type":"ContainerStarted","Data":"cda8cceec74b6fb38d0cd7ae8613b4b2d45dde15d0a48b166b8c01053c6539f1"} Mar 20 11:15:04 crc kubenswrapper[4860]: I0320 11:15:04.635200 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-zphz9" Mar 20 11:15:04 crc kubenswrapper[4860]: I0320 11:15:04.642264 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-pq75b" 
event={"ID":"fbbe8243-9afb-4fc5-90f1-04d6f0c074ef","Type":"ContainerStarted","Data":"ba26b0323463584a603cb064561889cf3fb5b72612fc2681710c544b1496e206"} Mar 20 11:15:04 crc kubenswrapper[4860]: I0320 11:15:04.643364 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-pq75b" Mar 20 11:15:04 crc kubenswrapper[4860]: I0320 11:15:04.652301 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-wfczk" event={"ID":"c54f27c4-bd61-4bad-bf91-376fee65d219","Type":"ContainerStarted","Data":"a224617de695744e3935046ee8fa401c32b3ae077622c9c12a3420884f019f7c"} Mar 20 11:15:04 crc kubenswrapper[4860]: I0320 11:15:04.652719 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-wfczk" Mar 20 11:15:04 crc kubenswrapper[4860]: I0320 11:15:04.658088 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-njzqs" podStartSLOduration=20.479190485 podStartE2EDuration="35.65805975s" podCreationTimestamp="2026-03-20 11:14:29 +0000 UTC" firstStartedPulling="2026-03-20 11:14:48.324477078 +0000 UTC m=+1212.545837976" lastFinishedPulling="2026-03-20 11:15:03.503346343 +0000 UTC m=+1227.724707241" observedRunningTime="2026-03-20 11:15:04.646397514 +0000 UTC m=+1228.867758422" watchObservedRunningTime="2026-03-20 11:15:04.65805975 +0000 UTC m=+1228.879420648" Mar 20 11:15:04 crc kubenswrapper[4860]: I0320 11:15:04.690791 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" event={"ID":"6a9df230-75a1-4b64-8d00-c179e9c19080","Type":"ContainerStarted","Data":"30e51fcb3abe382780764ee7923f09110290ea7a894489479cd1fa264f3e332e"} Mar 20 11:15:04 crc kubenswrapper[4860]: I0320 11:15:04.695550 4860 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-jd9bn" podStartSLOduration=3.744695935 podStartE2EDuration="34.695525464s" podCreationTimestamp="2026-03-20 11:14:30 +0000 UTC" firstStartedPulling="2026-03-20 11:14:32.551978219 +0000 UTC m=+1196.773339117" lastFinishedPulling="2026-03-20 11:15:03.502807738 +0000 UTC m=+1227.724168646" observedRunningTime="2026-03-20 11:15:04.691814133 +0000 UTC m=+1228.913175031" watchObservedRunningTime="2026-03-20 11:15:04.695525464 +0000 UTC m=+1228.916886362" Mar 20 11:15:04 crc kubenswrapper[4860]: I0320 11:15:04.707400 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-z8fp5" event={"ID":"6c2530cf-70b4-4a89-acff-086b36773edf","Type":"ContainerStarted","Data":"d2991549d097e0f8e60163dd8347ebd592f21a708b6bf45559157b0629e5b224"} Mar 20 11:15:04 crc kubenswrapper[4860]: I0320 11:15:04.708376 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-z8fp5" Mar 20 11:15:04 crc kubenswrapper[4860]: I0320 11:15:04.712780 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-tjt52" event={"ID":"431ab970-7f36-4ace-860c-479faac092a0","Type":"ContainerStarted","Data":"a7267c3f93af9a30407e1ba678b7a89c0f8d15d671aa4a7d18f3ff4922e4d82e"} Mar 20 11:15:04 crc kubenswrapper[4860]: I0320 11:15:04.713652 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-tjt52" Mar 20 11:15:04 crc kubenswrapper[4860]: I0320 11:15:04.714958 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-bj9gm" 
event={"ID":"20271235-6d5c-451f-a889-725d0b95503e","Type":"ContainerStarted","Data":"1d3b45bad39f776b6c84ee93cf3f652011c9061faef9e35f45a594b16cc6b8aa"} Mar 20 11:15:04 crc kubenswrapper[4860]: I0320 11:15:04.722131 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-4tdg4" event={"ID":"7f73053a-86aa-42dc-bcca-ee26a4fda2e5","Type":"ContainerStarted","Data":"d85ed8e6e34fbe2b90820068da762f81b3aa62920cd63e30346924dcf18ff7a4"} Mar 20 11:15:04 crc kubenswrapper[4860]: I0320 11:15:04.724473 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5784578c99-4tdg4" Mar 20 11:15:04 crc kubenswrapper[4860]: I0320 11:15:04.731097 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-5tqgx" event={"ID":"ecf64e38-138d-4ef7-8b17-c09f30358f3e","Type":"ContainerStarted","Data":"0a29df98ab6deaab732d4563afe2030291420ed0b8c860be68c2c0bbe6b53dfd"} Mar 20 11:15:04 crc kubenswrapper[4860]: I0320 11:15:04.745632 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-zphz9" podStartSLOduration=4.423815212 podStartE2EDuration="35.745599329s" podCreationTimestamp="2026-03-20 11:14:29 +0000 UTC" firstStartedPulling="2026-03-20 11:14:32.296824015 +0000 UTC m=+1196.518184913" lastFinishedPulling="2026-03-20 11:15:03.618608122 +0000 UTC m=+1227.839969030" observedRunningTime="2026-03-20 11:15:04.728128096 +0000 UTC m=+1228.949488994" watchObservedRunningTime="2026-03-20 11:15:04.745599329 +0000 UTC m=+1228.966960227" Mar 20 11:15:04 crc kubenswrapper[4860]: I0320 11:15:04.795583 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-pq75b" podStartSLOduration=4.807159155 
podStartE2EDuration="35.795553801s" podCreationTimestamp="2026-03-20 11:14:29 +0000 UTC" firstStartedPulling="2026-03-20 11:14:32.658135522 +0000 UTC m=+1196.879496420" lastFinishedPulling="2026-03-20 11:15:03.646530168 +0000 UTC m=+1227.867891066" observedRunningTime="2026-03-20 11:15:04.749616417 +0000 UTC m=+1228.970977315" watchObservedRunningTime="2026-03-20 11:15:04.795553801 +0000 UTC m=+1229.016914699" Mar 20 11:15:04 crc kubenswrapper[4860]: I0320 11:15:04.840198 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-ncmzn" podStartSLOduration=3.993839366 podStartE2EDuration="34.840174798s" podCreationTimestamp="2026-03-20 11:14:30 +0000 UTC" firstStartedPulling="2026-03-20 11:14:32.658957094 +0000 UTC m=+1196.880317992" lastFinishedPulling="2026-03-20 11:15:03.505292526 +0000 UTC m=+1227.726653424" observedRunningTime="2026-03-20 11:15:04.814414001 +0000 UTC m=+1229.035774899" watchObservedRunningTime="2026-03-20 11:15:04.840174798 +0000 UTC m=+1229.061535696" Mar 20 11:15:04 crc kubenswrapper[4860]: I0320 11:15:04.849578 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-tjt52" podStartSLOduration=4.905412624 podStartE2EDuration="35.849557312s" podCreationTimestamp="2026-03-20 11:14:29 +0000 UTC" firstStartedPulling="2026-03-20 11:14:32.558622529 +0000 UTC m=+1196.779983437" lastFinishedPulling="2026-03-20 11:15:03.502767217 +0000 UTC m=+1227.724128125" observedRunningTime="2026-03-20 11:15:04.838556244 +0000 UTC m=+1229.059917142" watchObservedRunningTime="2026-03-20 11:15:04.849557312 +0000 UTC m=+1229.070918210" Mar 20 11:15:04 crc kubenswrapper[4860]: I0320 11:15:04.884488 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-5784578c99-4tdg4" podStartSLOduration=4.69685691 
podStartE2EDuration="35.884456816s" podCreationTimestamp="2026-03-20 11:14:29 +0000 UTC" firstStartedPulling="2026-03-20 11:14:32.315661745 +0000 UTC m=+1196.537022643" lastFinishedPulling="2026-03-20 11:15:03.503261651 +0000 UTC m=+1227.724622549" observedRunningTime="2026-03-20 11:15:04.871993439 +0000 UTC m=+1229.093354337" watchObservedRunningTime="2026-03-20 11:15:04.884456816 +0000 UTC m=+1229.105817714" Mar 20 11:15:05 crc kubenswrapper[4860]: I0320 11:15:05.038213 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-wfczk" podStartSLOduration=4.735092375 podStartE2EDuration="36.038181596s" podCreationTimestamp="2026-03-20 11:14:29 +0000 UTC" firstStartedPulling="2026-03-20 11:14:32.313844116 +0000 UTC m=+1196.535205014" lastFinishedPulling="2026-03-20 11:15:03.616933337 +0000 UTC m=+1227.838294235" observedRunningTime="2026-03-20 11:15:04.976693502 +0000 UTC m=+1229.198054410" watchObservedRunningTime="2026-03-20 11:15:05.038181596 +0000 UTC m=+1229.259542494" Mar 20 11:15:05 crc kubenswrapper[4860]: I0320 11:15:05.047719 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-z8fp5" podStartSLOduration=4.989271734 podStartE2EDuration="36.047689614s" podCreationTimestamp="2026-03-20 11:14:29 +0000 UTC" firstStartedPulling="2026-03-20 11:14:32.560094229 +0000 UTC m=+1196.781455127" lastFinishedPulling="2026-03-20 11:15:03.618512109 +0000 UTC m=+1227.839873007" observedRunningTime="2026-03-20 11:15:05.0216802 +0000 UTC m=+1229.243041098" watchObservedRunningTime="2026-03-20 11:15:05.047689614 +0000 UTC m=+1229.269050512" Mar 20 11:15:05 crc kubenswrapper[4860]: I0320 11:15:05.740095 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-2692b" 
event={"ID":"20d35dc6-0fc2-4651-9dcd-855814132a5f","Type":"ContainerStarted","Data":"ccaba55bb65ff754bdc14a93c8613b20a19800a38989d934340a69d51a04a280"} Mar 20 11:15:05 crc kubenswrapper[4860]: I0320 11:15:05.741677 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-2692b" Mar 20 11:15:05 crc kubenswrapper[4860]: I0320 11:15:05.745067 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-hpk42" event={"ID":"84431296-0ca0-425a-8da8-c3ea46b08b29","Type":"ContainerStarted","Data":"12a84276fddb15a4fc74dbf9612e90fceec66679e5eacfe37bd7258933168d14"} Mar 20 11:15:05 crc kubenswrapper[4860]: I0320 11:15:05.745100 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-hpk42" event={"ID":"84431296-0ca0-425a-8da8-c3ea46b08b29","Type":"ContainerStarted","Data":"97cfe89104b2ede051095b525c6b537319ce860f5e50deffd73275cff40a8b5d"} Mar 20 11:15:05 crc kubenswrapper[4860]: I0320 11:15:05.745696 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-hpk42" Mar 20 11:15:05 crc kubenswrapper[4860]: I0320 11:15:05.749936 4860 generic.go:334] "Generic (PLEG): container finished" podID="20271235-6d5c-451f-a889-725d0b95503e" containerID="16689897d8bfeb9aac52fd534664320896c6183e3c8e9d6ce4861a8cab7d6c12" exitCode=0 Mar 20 11:15:05 crc kubenswrapper[4860]: I0320 11:15:05.750678 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-bj9gm" event={"ID":"20271235-6d5c-451f-a889-725d0b95503e","Type":"ContainerDied","Data":"16689897d8bfeb9aac52fd534664320896c6183e3c8e9d6ce4861a8cab7d6c12"} Mar 20 11:15:05 crc kubenswrapper[4860]: I0320 11:15:05.781699 4860 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-2692b" podStartSLOduration=5.010222409 podStartE2EDuration="36.781676294s" podCreationTimestamp="2026-03-20 11:14:29 +0000 UTC" firstStartedPulling="2026-03-20 11:14:32.275307413 +0000 UTC m=+1196.496668311" lastFinishedPulling="2026-03-20 11:15:04.046761298 +0000 UTC m=+1228.268122196" observedRunningTime="2026-03-20 11:15:05.774564991 +0000 UTC m=+1229.995925889" watchObservedRunningTime="2026-03-20 11:15:05.781676294 +0000 UTC m=+1230.003037192" Mar 20 11:15:05 crc kubenswrapper[4860]: I0320 11:15:05.827107 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-hpk42" podStartSLOduration=35.827074655 podStartE2EDuration="35.827074655s" podCreationTimestamp="2026-03-20 11:14:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 11:15:05.822297286 +0000 UTC m=+1230.043658194" watchObservedRunningTime="2026-03-20 11:15:05.827074655 +0000 UTC m=+1230.048435553" Mar 20 11:15:08 crc kubenswrapper[4860]: I0320 11:15:08.188291 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-bj9gm" Mar 20 11:15:08 crc kubenswrapper[4860]: I0320 11:15:08.298168 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnfm2\" (UniqueName: \"kubernetes.io/projected/20271235-6d5c-451f-a889-725d0b95503e-kube-api-access-cnfm2\") pod \"20271235-6d5c-451f-a889-725d0b95503e\" (UID: \"20271235-6d5c-451f-a889-725d0b95503e\") " Mar 20 11:15:08 crc kubenswrapper[4860]: I0320 11:15:08.298304 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/20271235-6d5c-451f-a889-725d0b95503e-secret-volume\") pod \"20271235-6d5c-451f-a889-725d0b95503e\" (UID: \"20271235-6d5c-451f-a889-725d0b95503e\") " Mar 20 11:15:08 crc kubenswrapper[4860]: I0320 11:15:08.298334 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/20271235-6d5c-451f-a889-725d0b95503e-config-volume\") pod \"20271235-6d5c-451f-a889-725d0b95503e\" (UID: \"20271235-6d5c-451f-a889-725d0b95503e\") " Mar 20 11:15:08 crc kubenswrapper[4860]: I0320 11:15:08.299617 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20271235-6d5c-451f-a889-725d0b95503e-config-volume" (OuterVolumeSpecName: "config-volume") pod "20271235-6d5c-451f-a889-725d0b95503e" (UID: "20271235-6d5c-451f-a889-725d0b95503e"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:15:08 crc kubenswrapper[4860]: I0320 11:15:08.308837 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20271235-6d5c-451f-a889-725d0b95503e-kube-api-access-cnfm2" (OuterVolumeSpecName: "kube-api-access-cnfm2") pod "20271235-6d5c-451f-a889-725d0b95503e" (UID: "20271235-6d5c-451f-a889-725d0b95503e"). 
InnerVolumeSpecName "kube-api-access-cnfm2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:15:08 crc kubenswrapper[4860]: I0320 11:15:08.315022 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20271235-6d5c-451f-a889-725d0b95503e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "20271235-6d5c-451f-a889-725d0b95503e" (UID: "20271235-6d5c-451f-a889-725d0b95503e"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:15:08 crc kubenswrapper[4860]: I0320 11:15:08.400576 4860 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/20271235-6d5c-451f-a889-725d0b95503e-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 11:15:08 crc kubenswrapper[4860]: I0320 11:15:08.400632 4860 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/20271235-6d5c-451f-a889-725d0b95503e-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 11:15:08 crc kubenswrapper[4860]: I0320 11:15:08.400647 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cnfm2\" (UniqueName: \"kubernetes.io/projected/20271235-6d5c-451f-a889-725d0b95503e-kube-api-access-cnfm2\") on node \"crc\" DevicePath \"\"" Mar 20 11:15:08 crc kubenswrapper[4860]: I0320 11:15:08.803539 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-5tqgx" event={"ID":"ecf64e38-138d-4ef7-8b17-c09f30358f3e","Type":"ContainerStarted","Data":"a65f23e5c33dc643e187e8f730a3a764838674bd742f1ae12902e714eb89978e"} Mar 20 11:15:08 crc kubenswrapper[4860]: I0320 11:15:08.803702 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-5tqgx" Mar 20 11:15:08 crc kubenswrapper[4860]: I0320 11:15:08.805966 4860 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-bj9gm" event={"ID":"20271235-6d5c-451f-a889-725d0b95503e","Type":"ContainerDied","Data":"1d3b45bad39f776b6c84ee93cf3f652011c9061faef9e35f45a594b16cc6b8aa"} Mar 20 11:15:08 crc kubenswrapper[4860]: I0320 11:15:08.806008 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-bj9gm" Mar 20 11:15:08 crc kubenswrapper[4860]: I0320 11:15:08.806024 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d3b45bad39f776b6c84ee93cf3f652011c9061faef9e35f45a594b16cc6b8aa" Mar 20 11:15:08 crc kubenswrapper[4860]: I0320 11:15:08.838215 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-5tqgx" podStartSLOduration=36.250226869 podStartE2EDuration="39.838195337s" podCreationTimestamp="2026-03-20 11:14:29 +0000 UTC" firstStartedPulling="2026-03-20 11:15:04.591621423 +0000 UTC m=+1228.812982321" lastFinishedPulling="2026-03-20 11:15:08.179589891 +0000 UTC m=+1232.400950789" observedRunningTime="2026-03-20 11:15:08.834148397 +0000 UTC m=+1233.055509295" watchObservedRunningTime="2026-03-20 11:15:08.838195337 +0000 UTC m=+1233.059556235" Mar 20 11:15:09 crc kubenswrapper[4860]: I0320 11:15:09.909284 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-2692b" Mar 20 11:15:09 crc kubenswrapper[4860]: I0320 11:15:09.966584 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-wfczk" Mar 20 11:15:10 crc kubenswrapper[4860]: I0320 11:15:10.129395 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-zphz9" Mar 20 11:15:10 crc kubenswrapper[4860]: I0320 11:15:10.366119 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-pq75b" Mar 20 11:15:10 crc kubenswrapper[4860]: I0320 11:15:10.442344 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-z8fp5" Mar 20 11:15:10 crc kubenswrapper[4860]: I0320 11:15:10.452691 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-tjt52" Mar 20 11:15:10 crc kubenswrapper[4860]: I0320 11:15:10.631373 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5784578c99-4tdg4" Mar 20 11:15:10 crc kubenswrapper[4860]: I0320 11:15:10.789144 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-jd9bn" Mar 20 11:15:11 crc kubenswrapper[4860]: I0320 11:15:11.179287 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-ncmzn" Mar 20 11:15:12 crc kubenswrapper[4860]: I0320 11:15:12.872578 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-hpk42" Mar 20 11:15:13 crc kubenswrapper[4860]: E0320 11:15:13.416131 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:6e7552996253fc66667eaa3eb0e11b4e97145efa2ae577155ceabf8e9913ddc1\\\"\"" 
pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-m8948" podUID="d7202366-6dc1-45ca-bb9a-74bdd0426c5f" Mar 20 11:15:15 crc kubenswrapper[4860]: I0320 11:15:15.898260 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-njzqs" Mar 20 11:15:22 crc kubenswrapper[4860]: I0320 11:15:22.347859 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-5tqgx" Mar 20 11:15:26 crc kubenswrapper[4860]: I0320 11:15:26.416454 4860 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 11:15:27 crc kubenswrapper[4860]: I0320 11:15:27.981275 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-m8948" event={"ID":"d7202366-6dc1-45ca-bb9a-74bdd0426c5f","Type":"ContainerStarted","Data":"cab15acc2448b50eaeed93fa772423a9f249e78bfcf9d91b29b289883986dd27"} Mar 20 11:15:27 crc kubenswrapper[4860]: I0320 11:15:27.982130 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-m8948" Mar 20 11:15:28 crc kubenswrapper[4860]: I0320 11:15:28.003245 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-m8948" podStartSLOduration=3.880607048 podStartE2EDuration="59.003191678s" podCreationTimestamp="2026-03-20 11:14:29 +0000 UTC" firstStartedPulling="2026-03-20 11:14:32.559932274 +0000 UTC m=+1196.781293172" lastFinishedPulling="2026-03-20 11:15:27.682516894 +0000 UTC m=+1251.903877802" observedRunningTime="2026-03-20 11:15:27.999474337 +0000 UTC m=+1252.220835235" watchObservedRunningTime="2026-03-20 11:15:28.003191678 +0000 UTC m=+1252.224552576" Mar 20 11:15:40 crc kubenswrapper[4860]: I0320 
11:15:40.620034 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-m8948" Mar 20 11:15:55 crc kubenswrapper[4860]: I0320 11:15:55.568565 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-dbwww"] Mar 20 11:15:55 crc kubenswrapper[4860]: E0320 11:15:55.569925 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20271235-6d5c-451f-a889-725d0b95503e" containerName="collect-profiles" Mar 20 11:15:55 crc kubenswrapper[4860]: I0320 11:15:55.569947 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="20271235-6d5c-451f-a889-725d0b95503e" containerName="collect-profiles" Mar 20 11:15:55 crc kubenswrapper[4860]: I0320 11:15:55.570143 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="20271235-6d5c-451f-a889-725d0b95503e" containerName="collect-profiles" Mar 20 11:15:55 crc kubenswrapper[4860]: I0320 11:15:55.571172 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-dbwww" Mar 20 11:15:55 crc kubenswrapper[4860]: I0320 11:15:55.616718 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 20 11:15:55 crc kubenswrapper[4860]: I0320 11:15:55.616728 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 20 11:15:55 crc kubenswrapper[4860]: I0320 11:15:55.616917 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-tt45n" Mar 20 11:15:55 crc kubenswrapper[4860]: I0320 11:15:55.617023 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 20 11:15:55 crc kubenswrapper[4860]: I0320 11:15:55.629756 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-dbwww"] Mar 20 11:15:55 crc kubenswrapper[4860]: I0320 11:15:55.714780 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-sjw98"] Mar 20 11:15:55 crc kubenswrapper[4860]: I0320 11:15:55.716423 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-sjw98" Mar 20 11:15:55 crc kubenswrapper[4860]: I0320 11:15:55.718862 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 20 11:15:55 crc kubenswrapper[4860]: I0320 11:15:55.719031 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1ac2367-dd61-4085-a756-ab4244a03144-config\") pod \"dnsmasq-dns-675f4bcbfc-dbwww\" (UID: \"f1ac2367-dd61-4085-a756-ab4244a03144\") " pod="openstack/dnsmasq-dns-675f4bcbfc-dbwww" Mar 20 11:15:55 crc kubenswrapper[4860]: I0320 11:15:55.719088 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzt8b\" (UniqueName: \"kubernetes.io/projected/f1ac2367-dd61-4085-a756-ab4244a03144-kube-api-access-dzt8b\") pod \"dnsmasq-dns-675f4bcbfc-dbwww\" (UID: \"f1ac2367-dd61-4085-a756-ab4244a03144\") " pod="openstack/dnsmasq-dns-675f4bcbfc-dbwww" Mar 20 11:15:55 crc kubenswrapper[4860]: I0320 11:15:55.725973 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-sjw98"] Mar 20 11:15:55 crc kubenswrapper[4860]: I0320 11:15:55.820705 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ccb7e541-f715-4030-8091-91f7e9eacb4c-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-sjw98\" (UID: \"ccb7e541-f715-4030-8091-91f7e9eacb4c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-sjw98" Mar 20 11:15:55 crc kubenswrapper[4860]: I0320 11:15:55.820785 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1ac2367-dd61-4085-a756-ab4244a03144-config\") pod \"dnsmasq-dns-675f4bcbfc-dbwww\" (UID: \"f1ac2367-dd61-4085-a756-ab4244a03144\") " pod="openstack/dnsmasq-dns-675f4bcbfc-dbwww" Mar 20 11:15:55 crc 
kubenswrapper[4860]: I0320 11:15:55.820818 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmlqn\" (UniqueName: \"kubernetes.io/projected/ccb7e541-f715-4030-8091-91f7e9eacb4c-kube-api-access-mmlqn\") pod \"dnsmasq-dns-78dd6ddcc-sjw98\" (UID: \"ccb7e541-f715-4030-8091-91f7e9eacb4c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-sjw98" Mar 20 11:15:55 crc kubenswrapper[4860]: I0320 11:15:55.820983 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzt8b\" (UniqueName: \"kubernetes.io/projected/f1ac2367-dd61-4085-a756-ab4244a03144-kube-api-access-dzt8b\") pod \"dnsmasq-dns-675f4bcbfc-dbwww\" (UID: \"f1ac2367-dd61-4085-a756-ab4244a03144\") " pod="openstack/dnsmasq-dns-675f4bcbfc-dbwww" Mar 20 11:15:55 crc kubenswrapper[4860]: I0320 11:15:55.821104 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccb7e541-f715-4030-8091-91f7e9eacb4c-config\") pod \"dnsmasq-dns-78dd6ddcc-sjw98\" (UID: \"ccb7e541-f715-4030-8091-91f7e9eacb4c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-sjw98" Mar 20 11:15:55 crc kubenswrapper[4860]: I0320 11:15:55.822314 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1ac2367-dd61-4085-a756-ab4244a03144-config\") pod \"dnsmasq-dns-675f4bcbfc-dbwww\" (UID: \"f1ac2367-dd61-4085-a756-ab4244a03144\") " pod="openstack/dnsmasq-dns-675f4bcbfc-dbwww" Mar 20 11:15:55 crc kubenswrapper[4860]: I0320 11:15:55.844464 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzt8b\" (UniqueName: \"kubernetes.io/projected/f1ac2367-dd61-4085-a756-ab4244a03144-kube-api-access-dzt8b\") pod \"dnsmasq-dns-675f4bcbfc-dbwww\" (UID: \"f1ac2367-dd61-4085-a756-ab4244a03144\") " pod="openstack/dnsmasq-dns-675f4bcbfc-dbwww" Mar 20 11:15:55 crc kubenswrapper[4860]: I0320 
11:15:55.923365 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccb7e541-f715-4030-8091-91f7e9eacb4c-config\") pod \"dnsmasq-dns-78dd6ddcc-sjw98\" (UID: \"ccb7e541-f715-4030-8091-91f7e9eacb4c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-sjw98" Mar 20 11:15:55 crc kubenswrapper[4860]: I0320 11:15:55.923518 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ccb7e541-f715-4030-8091-91f7e9eacb4c-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-sjw98\" (UID: \"ccb7e541-f715-4030-8091-91f7e9eacb4c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-sjw98" Mar 20 11:15:55 crc kubenswrapper[4860]: I0320 11:15:55.923573 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmlqn\" (UniqueName: \"kubernetes.io/projected/ccb7e541-f715-4030-8091-91f7e9eacb4c-kube-api-access-mmlqn\") pod \"dnsmasq-dns-78dd6ddcc-sjw98\" (UID: \"ccb7e541-f715-4030-8091-91f7e9eacb4c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-sjw98" Mar 20 11:15:55 crc kubenswrapper[4860]: I0320 11:15:55.924600 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccb7e541-f715-4030-8091-91f7e9eacb4c-config\") pod \"dnsmasq-dns-78dd6ddcc-sjw98\" (UID: \"ccb7e541-f715-4030-8091-91f7e9eacb4c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-sjw98" Mar 20 11:15:55 crc kubenswrapper[4860]: I0320 11:15:55.924733 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ccb7e541-f715-4030-8091-91f7e9eacb4c-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-sjw98\" (UID: \"ccb7e541-f715-4030-8091-91f7e9eacb4c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-sjw98" Mar 20 11:15:55 crc kubenswrapper[4860]: I0320 11:15:55.935955 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-dbwww" Mar 20 11:15:55 crc kubenswrapper[4860]: I0320 11:15:55.942618 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmlqn\" (UniqueName: \"kubernetes.io/projected/ccb7e541-f715-4030-8091-91f7e9eacb4c-kube-api-access-mmlqn\") pod \"dnsmasq-dns-78dd6ddcc-sjw98\" (UID: \"ccb7e541-f715-4030-8091-91f7e9eacb4c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-sjw98" Mar 20 11:15:56 crc kubenswrapper[4860]: I0320 11:15:56.156883 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-sjw98" Mar 20 11:15:56 crc kubenswrapper[4860]: I0320 11:15:56.789570 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-dbwww"] Mar 20 11:15:57 crc kubenswrapper[4860]: I0320 11:15:57.088550 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-sjw98"] Mar 20 11:15:57 crc kubenswrapper[4860]: I0320 11:15:57.219883 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-dbwww" event={"ID":"f1ac2367-dd61-4085-a756-ab4244a03144","Type":"ContainerStarted","Data":"80f7e07847f25b420187a5f84e199c965d08309e8c61a1c3ef33ff79bb484a84"} Mar 20 11:15:57 crc kubenswrapper[4860]: I0320 11:15:57.220950 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-sjw98" event={"ID":"ccb7e541-f715-4030-8091-91f7e9eacb4c","Type":"ContainerStarted","Data":"6a85cdad59940ceecd7205c5b097a140eb3a8f0947b37763cb9efb1b4fece931"} Mar 20 11:16:00 crc kubenswrapper[4860]: I0320 11:16:00.158673 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566756-77l7w"] Mar 20 11:16:00 crc kubenswrapper[4860]: I0320 11:16:00.161005 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566756-77l7w" Mar 20 11:16:00 crc kubenswrapper[4860]: I0320 11:16:00.165836 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-82p2f" Mar 20 11:16:00 crc kubenswrapper[4860]: I0320 11:16:00.168361 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566756-77l7w"] Mar 20 11:16:00 crc kubenswrapper[4860]: I0320 11:16:00.171619 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:16:00 crc kubenswrapper[4860]: I0320 11:16:00.172033 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:16:00 crc kubenswrapper[4860]: I0320 11:16:00.245961 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5sn5\" (UniqueName: \"kubernetes.io/projected/638de697-8881-4bb2-b204-2e87655dccbf-kube-api-access-j5sn5\") pod \"auto-csr-approver-29566756-77l7w\" (UID: \"638de697-8881-4bb2-b204-2e87655dccbf\") " pod="openshift-infra/auto-csr-approver-29566756-77l7w" Mar 20 11:16:00 crc kubenswrapper[4860]: I0320 11:16:00.347462 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5sn5\" (UniqueName: \"kubernetes.io/projected/638de697-8881-4bb2-b204-2e87655dccbf-kube-api-access-j5sn5\") pod \"auto-csr-approver-29566756-77l7w\" (UID: \"638de697-8881-4bb2-b204-2e87655dccbf\") " pod="openshift-infra/auto-csr-approver-29566756-77l7w" Mar 20 11:16:00 crc kubenswrapper[4860]: I0320 11:16:00.370860 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5sn5\" (UniqueName: \"kubernetes.io/projected/638de697-8881-4bb2-b204-2e87655dccbf-kube-api-access-j5sn5\") pod \"auto-csr-approver-29566756-77l7w\" (UID: \"638de697-8881-4bb2-b204-2e87655dccbf\") " 
pod="openshift-infra/auto-csr-approver-29566756-77l7w" Mar 20 11:16:00 crc kubenswrapper[4860]: I0320 11:16:00.496384 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566756-77l7w" Mar 20 11:16:01 crc kubenswrapper[4860]: I0320 11:16:01.269879 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566756-77l7w"] Mar 20 11:16:06 crc kubenswrapper[4860]: I0320 11:16:06.389670 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566756-77l7w" event={"ID":"638de697-8881-4bb2-b204-2e87655dccbf","Type":"ContainerStarted","Data":"e094e59656f19c31a79490192ffde331223a8942624338458c25d40663cb239a"} Mar 20 11:16:19 crc kubenswrapper[4860]: E0320 11:16:19.730700 4860 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 20 11:16:19 crc kubenswrapper[4860]: E0320 11:16:19.731846 4860 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mmlqn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-sjw98_openstack(ccb7e541-f715-4030-8091-91f7e9eacb4c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 11:16:19 crc kubenswrapper[4860]: E0320 11:16:19.733053 4860 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-sjw98" podUID="ccb7e541-f715-4030-8091-91f7e9eacb4c" Mar 20 11:16:19 crc kubenswrapper[4860]: E0320 11:16:19.759044 4860 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 20 11:16:19 crc kubenswrapper[4860]: E0320 11:16:19.759294 4860 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dzt8b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-dbwww_openstack(f1ac2367-dd61-4085-a756-ab4244a03144): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 11:16:19 crc kubenswrapper[4860]: E0320 11:16:19.760483 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-dbwww" podUID="f1ac2367-dd61-4085-a756-ab4244a03144" Mar 20 11:16:20 crc kubenswrapper[4860]: I0320 11:16:20.658529 4860 generic.go:334] "Generic (PLEG): container finished" podID="638de697-8881-4bb2-b204-2e87655dccbf" containerID="615c37395180628a3c76825ddb15312c7ceadec62513b183ca243bd28c96c9ed" exitCode=0 Mar 20 11:16:20 crc kubenswrapper[4860]: I0320 11:16:20.658584 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566756-77l7w" event={"ID":"638de697-8881-4bb2-b204-2e87655dccbf","Type":"ContainerDied","Data":"615c37395180628a3c76825ddb15312c7ceadec62513b183ca243bd28c96c9ed"} Mar 20 11:16:20 crc kubenswrapper[4860]: E0320 11:16:20.660729 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-78dd6ddcc-sjw98" podUID="ccb7e541-f715-4030-8091-91f7e9eacb4c" Mar 20 11:16:20 crc kubenswrapper[4860]: E0320 11:16:20.660778 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-675f4bcbfc-dbwww" podUID="f1ac2367-dd61-4085-a756-ab4244a03144" Mar 20 11:16:21 crc kubenswrapper[4860]: I0320 11:16:21.962619 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566756-77l7w" Mar 20 11:16:22 crc kubenswrapper[4860]: I0320 11:16:22.092737 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5sn5\" (UniqueName: \"kubernetes.io/projected/638de697-8881-4bb2-b204-2e87655dccbf-kube-api-access-j5sn5\") pod \"638de697-8881-4bb2-b204-2e87655dccbf\" (UID: \"638de697-8881-4bb2-b204-2e87655dccbf\") " Mar 20 11:16:22 crc kubenswrapper[4860]: I0320 11:16:22.100530 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/638de697-8881-4bb2-b204-2e87655dccbf-kube-api-access-j5sn5" (OuterVolumeSpecName: "kube-api-access-j5sn5") pod "638de697-8881-4bb2-b204-2e87655dccbf" (UID: "638de697-8881-4bb2-b204-2e87655dccbf"). InnerVolumeSpecName "kube-api-access-j5sn5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:16:22 crc kubenswrapper[4860]: I0320 11:16:22.194281 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5sn5\" (UniqueName: \"kubernetes.io/projected/638de697-8881-4bb2-b204-2e87655dccbf-kube-api-access-j5sn5\") on node \"crc\" DevicePath \"\"" Mar 20 11:16:22 crc kubenswrapper[4860]: I0320 11:16:22.677364 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566756-77l7w" event={"ID":"638de697-8881-4bb2-b204-2e87655dccbf","Type":"ContainerDied","Data":"e094e59656f19c31a79490192ffde331223a8942624338458c25d40663cb239a"} Mar 20 11:16:22 crc kubenswrapper[4860]: I0320 11:16:22.677907 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e094e59656f19c31a79490192ffde331223a8942624338458c25d40663cb239a" Mar 20 11:16:22 crc kubenswrapper[4860]: I0320 11:16:22.677472 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566756-77l7w" Mar 20 11:16:23 crc kubenswrapper[4860]: I0320 11:16:23.038071 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566750-bgxvq"] Mar 20 11:16:23 crc kubenswrapper[4860]: I0320 11:16:23.044020 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566750-bgxvq"] Mar 20 11:16:23 crc kubenswrapper[4860]: I0320 11:16:23.423624 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b45dae17-b8e6-4d57-a525-2892e7ff37f7" path="/var/lib/kubelet/pods/b45dae17-b8e6-4d57-a525-2892e7ff37f7/volumes" Mar 20 11:16:33 crc kubenswrapper[4860]: I0320 11:16:33.770534 4860 generic.go:334] "Generic (PLEG): container finished" podID="ccb7e541-f715-4030-8091-91f7e9eacb4c" containerID="436c166a207b8ff3d6c5eee2e92fa2901918a501a46fa1762e6609f84378e3b4" exitCode=0 Mar 20 11:16:33 crc kubenswrapper[4860]: I0320 11:16:33.770636 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-sjw98" event={"ID":"ccb7e541-f715-4030-8091-91f7e9eacb4c","Type":"ContainerDied","Data":"436c166a207b8ff3d6c5eee2e92fa2901918a501a46fa1762e6609f84378e3b4"} Mar 20 11:16:34 crc kubenswrapper[4860]: I0320 11:16:34.785621 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-sjw98" event={"ID":"ccb7e541-f715-4030-8091-91f7e9eacb4c","Type":"ContainerStarted","Data":"4219324ec803228788fd007f3cb36c789afe296d11d52a560d37bcdf324d7f39"} Mar 20 11:16:34 crc kubenswrapper[4860]: I0320 11:16:34.786529 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-78dd6ddcc-sjw98" Mar 20 11:16:34 crc kubenswrapper[4860]: I0320 11:16:34.788007 4860 generic.go:334] "Generic (PLEG): container finished" podID="f1ac2367-dd61-4085-a756-ab4244a03144" containerID="b5035d8f8bb58d6e0bc5b22aaf08083d6babeae6b9315a1daf4306a6ccee0dcc" exitCode=0 Mar 20 
11:16:34 crc kubenswrapper[4860]: I0320 11:16:34.788048 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-dbwww" event={"ID":"f1ac2367-dd61-4085-a756-ab4244a03144","Type":"ContainerDied","Data":"b5035d8f8bb58d6e0bc5b22aaf08083d6babeae6b9315a1daf4306a6ccee0dcc"} Mar 20 11:16:34 crc kubenswrapper[4860]: I0320 11:16:34.810548 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-78dd6ddcc-sjw98" podStartSLOduration=4.025659316 podStartE2EDuration="39.81052246s" podCreationTimestamp="2026-03-20 11:15:55 +0000 UTC" firstStartedPulling="2026-03-20 11:15:57.09613722 +0000 UTC m=+1281.317498118" lastFinishedPulling="2026-03-20 11:16:32.881000364 +0000 UTC m=+1317.102361262" observedRunningTime="2026-03-20 11:16:34.805882464 +0000 UTC m=+1319.027243382" watchObservedRunningTime="2026-03-20 11:16:34.81052246 +0000 UTC m=+1319.031883358" Mar 20 11:16:35 crc kubenswrapper[4860]: I0320 11:16:35.796990 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-dbwww" event={"ID":"f1ac2367-dd61-4085-a756-ab4244a03144","Type":"ContainerStarted","Data":"e1607f1c44622aa70104e313e3bc627fe6b0f592fbbaece9de80793244209864"} Mar 20 11:16:35 crc kubenswrapper[4860]: I0320 11:16:35.797843 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-675f4bcbfc-dbwww" Mar 20 11:16:35 crc kubenswrapper[4860]: I0320 11:16:35.826538 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-675f4bcbfc-dbwww" podStartSLOduration=-9223371996.028267 podStartE2EDuration="40.826509217s" podCreationTimestamp="2026-03-20 11:15:55 +0000 UTC" firstStartedPulling="2026-03-20 11:15:56.796302041 +0000 UTC m=+1281.017662939" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 11:16:35.818640794 +0000 UTC m=+1320.040001722" watchObservedRunningTime="2026-03-20 11:16:35.826509217 
+0000 UTC m=+1320.047870105" Mar 20 11:16:40 crc kubenswrapper[4860]: I0320 11:16:40.938546 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-675f4bcbfc-dbwww" Mar 20 11:16:41 crc kubenswrapper[4860]: I0320 11:16:41.159039 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-78dd6ddcc-sjw98" Mar 20 11:16:41 crc kubenswrapper[4860]: I0320 11:16:41.208577 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-dbwww"] Mar 20 11:16:41 crc kubenswrapper[4860]: I0320 11:16:41.862560 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-675f4bcbfc-dbwww" podUID="f1ac2367-dd61-4085-a756-ab4244a03144" containerName="dnsmasq-dns" containerID="cri-o://e1607f1c44622aa70104e313e3bc627fe6b0f592fbbaece9de80793244209864" gracePeriod=10 Mar 20 11:16:42 crc kubenswrapper[4860]: I0320 11:16:42.273686 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-dbwww" Mar 20 11:16:42 crc kubenswrapper[4860]: I0320 11:16:42.468106 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dzt8b\" (UniqueName: \"kubernetes.io/projected/f1ac2367-dd61-4085-a756-ab4244a03144-kube-api-access-dzt8b\") pod \"f1ac2367-dd61-4085-a756-ab4244a03144\" (UID: \"f1ac2367-dd61-4085-a756-ab4244a03144\") " Mar 20 11:16:42 crc kubenswrapper[4860]: I0320 11:16:42.468733 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1ac2367-dd61-4085-a756-ab4244a03144-config\") pod \"f1ac2367-dd61-4085-a756-ab4244a03144\" (UID: \"f1ac2367-dd61-4085-a756-ab4244a03144\") " Mar 20 11:16:42 crc kubenswrapper[4860]: I0320 11:16:42.475665 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1ac2367-dd61-4085-a756-ab4244a03144-kube-api-access-dzt8b" (OuterVolumeSpecName: "kube-api-access-dzt8b") pod "f1ac2367-dd61-4085-a756-ab4244a03144" (UID: "f1ac2367-dd61-4085-a756-ab4244a03144"). InnerVolumeSpecName "kube-api-access-dzt8b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:16:42 crc kubenswrapper[4860]: I0320 11:16:42.505867 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1ac2367-dd61-4085-a756-ab4244a03144-config" (OuterVolumeSpecName: "config") pod "f1ac2367-dd61-4085-a756-ab4244a03144" (UID: "f1ac2367-dd61-4085-a756-ab4244a03144"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:16:42 crc kubenswrapper[4860]: I0320 11:16:42.570254 4860 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1ac2367-dd61-4085-a756-ab4244a03144-config\") on node \"crc\" DevicePath \"\"" Mar 20 11:16:42 crc kubenswrapper[4860]: I0320 11:16:42.570312 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dzt8b\" (UniqueName: \"kubernetes.io/projected/f1ac2367-dd61-4085-a756-ab4244a03144-kube-api-access-dzt8b\") on node \"crc\" DevicePath \"\"" Mar 20 11:16:42 crc kubenswrapper[4860]: I0320 11:16:42.873020 4860 generic.go:334] "Generic (PLEG): container finished" podID="f1ac2367-dd61-4085-a756-ab4244a03144" containerID="e1607f1c44622aa70104e313e3bc627fe6b0f592fbbaece9de80793244209864" exitCode=0 Mar 20 11:16:42 crc kubenswrapper[4860]: I0320 11:16:42.873059 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-dbwww" event={"ID":"f1ac2367-dd61-4085-a756-ab4244a03144","Type":"ContainerDied","Data":"e1607f1c44622aa70104e313e3bc627fe6b0f592fbbaece9de80793244209864"} Mar 20 11:16:42 crc kubenswrapper[4860]: I0320 11:16:42.873118 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-dbwww" event={"ID":"f1ac2367-dd61-4085-a756-ab4244a03144","Type":"ContainerDied","Data":"80f7e07847f25b420187a5f84e199c965d08309e8c61a1c3ef33ff79bb484a84"} Mar 20 11:16:42 crc kubenswrapper[4860]: I0320 11:16:42.873134 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-dbwww" Mar 20 11:16:42 crc kubenswrapper[4860]: I0320 11:16:42.873143 4860 scope.go:117] "RemoveContainer" containerID="e1607f1c44622aa70104e313e3bc627fe6b0f592fbbaece9de80793244209864" Mar 20 11:16:42 crc kubenswrapper[4860]: I0320 11:16:42.908428 4860 scope.go:117] "RemoveContainer" containerID="b5035d8f8bb58d6e0bc5b22aaf08083d6babeae6b9315a1daf4306a6ccee0dcc" Mar 20 11:16:42 crc kubenswrapper[4860]: I0320 11:16:42.914460 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-dbwww"] Mar 20 11:16:42 crc kubenswrapper[4860]: I0320 11:16:42.921297 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-dbwww"] Mar 20 11:16:42 crc kubenswrapper[4860]: I0320 11:16:42.933406 4860 scope.go:117] "RemoveContainer" containerID="e1607f1c44622aa70104e313e3bc627fe6b0f592fbbaece9de80793244209864" Mar 20 11:16:42 crc kubenswrapper[4860]: E0320 11:16:42.934179 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1607f1c44622aa70104e313e3bc627fe6b0f592fbbaece9de80793244209864\": container with ID starting with e1607f1c44622aa70104e313e3bc627fe6b0f592fbbaece9de80793244209864 not found: ID does not exist" containerID="e1607f1c44622aa70104e313e3bc627fe6b0f592fbbaece9de80793244209864" Mar 20 11:16:42 crc kubenswrapper[4860]: I0320 11:16:42.934250 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1607f1c44622aa70104e313e3bc627fe6b0f592fbbaece9de80793244209864"} err="failed to get container status \"e1607f1c44622aa70104e313e3bc627fe6b0f592fbbaece9de80793244209864\": rpc error: code = NotFound desc = could not find container \"e1607f1c44622aa70104e313e3bc627fe6b0f592fbbaece9de80793244209864\": container with ID starting with e1607f1c44622aa70104e313e3bc627fe6b0f592fbbaece9de80793244209864 not found: ID does not exist" Mar 20 
11:16:42 crc kubenswrapper[4860]: I0320 11:16:42.934284 4860 scope.go:117] "RemoveContainer" containerID="b5035d8f8bb58d6e0bc5b22aaf08083d6babeae6b9315a1daf4306a6ccee0dcc" Mar 20 11:16:42 crc kubenswrapper[4860]: E0320 11:16:42.934766 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5035d8f8bb58d6e0bc5b22aaf08083d6babeae6b9315a1daf4306a6ccee0dcc\": container with ID starting with b5035d8f8bb58d6e0bc5b22aaf08083d6babeae6b9315a1daf4306a6ccee0dcc not found: ID does not exist" containerID="b5035d8f8bb58d6e0bc5b22aaf08083d6babeae6b9315a1daf4306a6ccee0dcc" Mar 20 11:16:42 crc kubenswrapper[4860]: I0320 11:16:42.934795 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5035d8f8bb58d6e0bc5b22aaf08083d6babeae6b9315a1daf4306a6ccee0dcc"} err="failed to get container status \"b5035d8f8bb58d6e0bc5b22aaf08083d6babeae6b9315a1daf4306a6ccee0dcc\": rpc error: code = NotFound desc = could not find container \"b5035d8f8bb58d6e0bc5b22aaf08083d6babeae6b9315a1daf4306a6ccee0dcc\": container with ID starting with b5035d8f8bb58d6e0bc5b22aaf08083d6babeae6b9315a1daf4306a6ccee0dcc not found: ID does not exist" Mar 20 11:16:43 crc kubenswrapper[4860]: I0320 11:16:43.423150 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1ac2367-dd61-4085-a756-ab4244a03144" path="/var/lib/kubelet/pods/f1ac2367-dd61-4085-a756-ab4244a03144/volumes" Mar 20 11:16:47 crc kubenswrapper[4860]: I0320 11:16:47.834388 4860 scope.go:117] "RemoveContainer" containerID="bb236d0c90b35c798ab0b91ca64ed98eb462e09d8cbe538c6779b53064938615" Mar 20 11:17:22 crc kubenswrapper[4860]: I0320 11:17:22.344075 4860 patch_prober.go:28] interesting pod/machine-config-daemon-kvdqp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Mar 20 11:17:22 crc kubenswrapper[4860]: I0320 11:17:22.345046 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:17:52 crc kubenswrapper[4860]: I0320 11:17:52.346384 4860 patch_prober.go:28] interesting pod/machine-config-daemon-kvdqp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:17:52 crc kubenswrapper[4860]: I0320 11:17:52.347065 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:18:00 crc kubenswrapper[4860]: I0320 11:18:00.153429 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566758-bblpr"] Mar 20 11:18:00 crc kubenswrapper[4860]: E0320 11:18:00.154752 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="638de697-8881-4bb2-b204-2e87655dccbf" containerName="oc" Mar 20 11:18:00 crc kubenswrapper[4860]: I0320 11:18:00.154770 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="638de697-8881-4bb2-b204-2e87655dccbf" containerName="oc" Mar 20 11:18:00 crc kubenswrapper[4860]: E0320 11:18:00.154795 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1ac2367-dd61-4085-a756-ab4244a03144" containerName="init" Mar 20 11:18:00 crc kubenswrapper[4860]: I0320 11:18:00.154802 4860 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="f1ac2367-dd61-4085-a756-ab4244a03144" containerName="init" Mar 20 11:18:00 crc kubenswrapper[4860]: E0320 11:18:00.154827 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1ac2367-dd61-4085-a756-ab4244a03144" containerName="dnsmasq-dns" Mar 20 11:18:00 crc kubenswrapper[4860]: I0320 11:18:00.154835 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1ac2367-dd61-4085-a756-ab4244a03144" containerName="dnsmasq-dns" Mar 20 11:18:00 crc kubenswrapper[4860]: I0320 11:18:00.155015 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1ac2367-dd61-4085-a756-ab4244a03144" containerName="dnsmasq-dns" Mar 20 11:18:00 crc kubenswrapper[4860]: I0320 11:18:00.155032 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="638de697-8881-4bb2-b204-2e87655dccbf" containerName="oc" Mar 20 11:18:00 crc kubenswrapper[4860]: I0320 11:18:00.155766 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566758-bblpr" Mar 20 11:18:00 crc kubenswrapper[4860]: I0320 11:18:00.159976 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:18:00 crc kubenswrapper[4860]: I0320 11:18:00.160575 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:18:00 crc kubenswrapper[4860]: I0320 11:18:00.162305 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-82p2f" Mar 20 11:18:00 crc kubenswrapper[4860]: I0320 11:18:00.167010 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566758-bblpr"] Mar 20 11:18:00 crc kubenswrapper[4860]: I0320 11:18:00.343398 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zs5x5\" (UniqueName: 
\"kubernetes.io/projected/3d4db42a-d915-4c4d-a985-be77a5381514-kube-api-access-zs5x5\") pod \"auto-csr-approver-29566758-bblpr\" (UID: \"3d4db42a-d915-4c4d-a985-be77a5381514\") " pod="openshift-infra/auto-csr-approver-29566758-bblpr" Mar 20 11:18:00 crc kubenswrapper[4860]: I0320 11:18:00.445399 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zs5x5\" (UniqueName: \"kubernetes.io/projected/3d4db42a-d915-4c4d-a985-be77a5381514-kube-api-access-zs5x5\") pod \"auto-csr-approver-29566758-bblpr\" (UID: \"3d4db42a-d915-4c4d-a985-be77a5381514\") " pod="openshift-infra/auto-csr-approver-29566758-bblpr" Mar 20 11:18:00 crc kubenswrapper[4860]: I0320 11:18:00.475561 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zs5x5\" (UniqueName: \"kubernetes.io/projected/3d4db42a-d915-4c4d-a985-be77a5381514-kube-api-access-zs5x5\") pod \"auto-csr-approver-29566758-bblpr\" (UID: \"3d4db42a-d915-4c4d-a985-be77a5381514\") " pod="openshift-infra/auto-csr-approver-29566758-bblpr" Mar 20 11:18:00 crc kubenswrapper[4860]: I0320 11:18:00.479265 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566758-bblpr" Mar 20 11:18:00 crc kubenswrapper[4860]: I0320 11:18:00.925350 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566758-bblpr"] Mar 20 11:18:01 crc kubenswrapper[4860]: I0320 11:18:01.517646 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566758-bblpr" event={"ID":"3d4db42a-d915-4c4d-a985-be77a5381514","Type":"ContainerStarted","Data":"77bf8420052e50572c0a65816c72120e464ce959a0d30fadee753600216da879"} Mar 20 11:18:02 crc kubenswrapper[4860]: I0320 11:18:02.527384 4860 generic.go:334] "Generic (PLEG): container finished" podID="3d4db42a-d915-4c4d-a985-be77a5381514" containerID="666fc76c19255af020ada26a1d756d00c6fc27b0113301cab647ad4c35e9ef0c" exitCode=0 Mar 20 11:18:02 crc kubenswrapper[4860]: I0320 11:18:02.527448 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566758-bblpr" event={"ID":"3d4db42a-d915-4c4d-a985-be77a5381514","Type":"ContainerDied","Data":"666fc76c19255af020ada26a1d756d00c6fc27b0113301cab647ad4c35e9ef0c"} Mar 20 11:18:03 crc kubenswrapper[4860]: I0320 11:18:03.822097 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566758-bblpr" Mar 20 11:18:04 crc kubenswrapper[4860]: I0320 11:18:04.005283 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zs5x5\" (UniqueName: \"kubernetes.io/projected/3d4db42a-d915-4c4d-a985-be77a5381514-kube-api-access-zs5x5\") pod \"3d4db42a-d915-4c4d-a985-be77a5381514\" (UID: \"3d4db42a-d915-4c4d-a985-be77a5381514\") " Mar 20 11:18:04 crc kubenswrapper[4860]: I0320 11:18:04.013605 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d4db42a-d915-4c4d-a985-be77a5381514-kube-api-access-zs5x5" (OuterVolumeSpecName: "kube-api-access-zs5x5") pod "3d4db42a-d915-4c4d-a985-be77a5381514" (UID: "3d4db42a-d915-4c4d-a985-be77a5381514"). InnerVolumeSpecName "kube-api-access-zs5x5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:18:04 crc kubenswrapper[4860]: I0320 11:18:04.107264 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zs5x5\" (UniqueName: \"kubernetes.io/projected/3d4db42a-d915-4c4d-a985-be77a5381514-kube-api-access-zs5x5\") on node \"crc\" DevicePath \"\"" Mar 20 11:18:04 crc kubenswrapper[4860]: I0320 11:18:04.545399 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566758-bblpr" event={"ID":"3d4db42a-d915-4c4d-a985-be77a5381514","Type":"ContainerDied","Data":"77bf8420052e50572c0a65816c72120e464ce959a0d30fadee753600216da879"} Mar 20 11:18:04 crc kubenswrapper[4860]: I0320 11:18:04.545911 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="77bf8420052e50572c0a65816c72120e464ce959a0d30fadee753600216da879" Mar 20 11:18:04 crc kubenswrapper[4860]: I0320 11:18:04.545757 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566758-bblpr" Mar 20 11:18:04 crc kubenswrapper[4860]: I0320 11:18:04.899980 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566752-kn2qv"] Mar 20 11:18:04 crc kubenswrapper[4860]: I0320 11:18:04.906430 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566752-kn2qv"] Mar 20 11:18:05 crc kubenswrapper[4860]: I0320 11:18:05.429012 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdc939b6-92ac-4e00-ae32-b518e4257043" path="/var/lib/kubelet/pods/fdc939b6-92ac-4e00-ae32-b518e4257043/volumes" Mar 20 11:18:22 crc kubenswrapper[4860]: I0320 11:18:22.344780 4860 patch_prober.go:28] interesting pod/machine-config-daemon-kvdqp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:18:22 crc kubenswrapper[4860]: I0320 11:18:22.345674 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:18:22 crc kubenswrapper[4860]: I0320 11:18:22.345736 4860 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" Mar 20 11:18:22 crc kubenswrapper[4860]: I0320 11:18:22.346707 4860 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"30e51fcb3abe382780764ee7923f09110290ea7a894489479cd1fa264f3e332e"} pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 11:18:22 crc kubenswrapper[4860]: I0320 11:18:22.346774 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" containerID="cri-o://30e51fcb3abe382780764ee7923f09110290ea7a894489479cd1fa264f3e332e" gracePeriod=600
Mar 20 11:18:22 crc kubenswrapper[4860]: I0320 11:18:22.697663 4860 generic.go:334] "Generic (PLEG): container finished" podID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerID="30e51fcb3abe382780764ee7923f09110290ea7a894489479cd1fa264f3e332e" exitCode=0
Mar 20 11:18:22 crc kubenswrapper[4860]: I0320 11:18:22.697739 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" event={"ID":"6a9df230-75a1-4b64-8d00-c179e9c19080","Type":"ContainerDied","Data":"30e51fcb3abe382780764ee7923f09110290ea7a894489479cd1fa264f3e332e"}
Mar 20 11:18:22 crc kubenswrapper[4860]: I0320 11:18:22.698196 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" event={"ID":"6a9df230-75a1-4b64-8d00-c179e9c19080","Type":"ContainerStarted","Data":"07ab4f7bad4a73673176b5c6eb66987e832ab6b48e2270c87816fed93c5b31d4"}
Mar 20 11:18:22 crc kubenswrapper[4860]: I0320 11:18:22.698280 4860 scope.go:117] "RemoveContainer" containerID="88d9c34d042546706542bef7e7c591b52fa1f874953861e5601a8c6aea607a26"
Mar 20 11:18:47 crc kubenswrapper[4860]: I0320 11:18:47.927973 4860 scope.go:117] "RemoveContainer" containerID="986b40f9c6d66be5183f5bf7b868f2a3962c56f81df4ee2138cb170b1b825e18"
Mar 20 11:20:00 crc kubenswrapper[4860]: I0320 11:20:00.157208 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566760-gs8qf"]
Mar 20 11:20:00 crc kubenswrapper[4860]: E0320 11:20:00.158284 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d4db42a-d915-4c4d-a985-be77a5381514" containerName="oc"
Mar 20 11:20:00 crc kubenswrapper[4860]: I0320 11:20:00.158304 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d4db42a-d915-4c4d-a985-be77a5381514" containerName="oc"
Mar 20 11:20:00 crc kubenswrapper[4860]: I0320 11:20:00.158559 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d4db42a-d915-4c4d-a985-be77a5381514" containerName="oc"
Mar 20 11:20:00 crc kubenswrapper[4860]: I0320 11:20:00.159252 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566760-gs8qf"
Mar 20 11:20:00 crc kubenswrapper[4860]: I0320 11:20:00.162713 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 11:20:00 crc kubenswrapper[4860]: I0320 11:20:00.163131 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-82p2f"
Mar 20 11:20:00 crc kubenswrapper[4860]: I0320 11:20:00.167180 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 11:20:00 crc kubenswrapper[4860]: I0320 11:20:00.173783 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566760-gs8qf"]
Mar 20 11:20:00 crc kubenswrapper[4860]: I0320 11:20:00.289042 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhtrs\" (UniqueName: \"kubernetes.io/projected/02d4a854-e21a-46b4-976b-17645af17c8b-kube-api-access-bhtrs\") pod \"auto-csr-approver-29566760-gs8qf\" (UID: \"02d4a854-e21a-46b4-976b-17645af17c8b\") " pod="openshift-infra/auto-csr-approver-29566760-gs8qf"
Mar 20 11:20:00 crc kubenswrapper[4860]: I0320 11:20:00.391373 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhtrs\" (UniqueName: \"kubernetes.io/projected/02d4a854-e21a-46b4-976b-17645af17c8b-kube-api-access-bhtrs\") pod \"auto-csr-approver-29566760-gs8qf\" (UID: \"02d4a854-e21a-46b4-976b-17645af17c8b\") " pod="openshift-infra/auto-csr-approver-29566760-gs8qf"
Mar 20 11:20:00 crc kubenswrapper[4860]: I0320 11:20:00.415084 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhtrs\" (UniqueName: \"kubernetes.io/projected/02d4a854-e21a-46b4-976b-17645af17c8b-kube-api-access-bhtrs\") pod \"auto-csr-approver-29566760-gs8qf\" (UID: \"02d4a854-e21a-46b4-976b-17645af17c8b\") " pod="openshift-infra/auto-csr-approver-29566760-gs8qf"
Mar 20 11:20:00 crc kubenswrapper[4860]: I0320 11:20:00.482948 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566760-gs8qf"
Mar 20 11:20:00 crc kubenswrapper[4860]: I0320 11:20:00.917562 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566760-gs8qf"]
Mar 20 11:20:01 crc kubenswrapper[4860]: I0320 11:20:01.603851 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566760-gs8qf" event={"ID":"02d4a854-e21a-46b4-976b-17645af17c8b","Type":"ContainerStarted","Data":"a85478ba1f3ebe87684633db4f6f4973b86ce2fd1180f8cc06726b92ab0ef1cf"}
Mar 20 11:20:02 crc kubenswrapper[4860]: I0320 11:20:02.615577 4860 generic.go:334] "Generic (PLEG): container finished" podID="02d4a854-e21a-46b4-976b-17645af17c8b" containerID="b7ac94cb420b15b471714072b91c8c315997e55e973785d5b6c9d7428acd11e7" exitCode=0
Mar 20 11:20:02 crc kubenswrapper[4860]: I0320 11:20:02.615687 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566760-gs8qf" event={"ID":"02d4a854-e21a-46b4-976b-17645af17c8b","Type":"ContainerDied","Data":"b7ac94cb420b15b471714072b91c8c315997e55e973785d5b6c9d7428acd11e7"}
Mar 20 11:20:03 crc kubenswrapper[4860]: I0320 11:20:03.907776 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566760-gs8qf"
Mar 20 11:20:04 crc kubenswrapper[4860]: I0320 11:20:04.048774 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhtrs\" (UniqueName: \"kubernetes.io/projected/02d4a854-e21a-46b4-976b-17645af17c8b-kube-api-access-bhtrs\") pod \"02d4a854-e21a-46b4-976b-17645af17c8b\" (UID: \"02d4a854-e21a-46b4-976b-17645af17c8b\") "
Mar 20 11:20:04 crc kubenswrapper[4860]: I0320 11:20:04.055072 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02d4a854-e21a-46b4-976b-17645af17c8b-kube-api-access-bhtrs" (OuterVolumeSpecName: "kube-api-access-bhtrs") pod "02d4a854-e21a-46b4-976b-17645af17c8b" (UID: "02d4a854-e21a-46b4-976b-17645af17c8b"). InnerVolumeSpecName "kube-api-access-bhtrs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 11:20:04 crc kubenswrapper[4860]: I0320 11:20:04.151069 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bhtrs\" (UniqueName: \"kubernetes.io/projected/02d4a854-e21a-46b4-976b-17645af17c8b-kube-api-access-bhtrs\") on node \"crc\" DevicePath \"\""
Mar 20 11:20:04 crc kubenswrapper[4860]: I0320 11:20:04.634637 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566760-gs8qf" event={"ID":"02d4a854-e21a-46b4-976b-17645af17c8b","Type":"ContainerDied","Data":"a85478ba1f3ebe87684633db4f6f4973b86ce2fd1180f8cc06726b92ab0ef1cf"}
Mar 20 11:20:04 crc kubenswrapper[4860]: I0320 11:20:04.634699 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a85478ba1f3ebe87684633db4f6f4973b86ce2fd1180f8cc06726b92ab0ef1cf"
Mar 20 11:20:04 crc kubenswrapper[4860]: I0320 11:20:04.634707 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566760-gs8qf"
Mar 20 11:20:04 crc kubenswrapper[4860]: I0320 11:20:04.976539 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566754-wdxxk"]
Mar 20 11:20:04 crc kubenswrapper[4860]: I0320 11:20:04.984181 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566754-wdxxk"]
Mar 20 11:20:05 crc kubenswrapper[4860]: I0320 11:20:05.429515 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a90da115-522c-4858-935f-7d4a7211c8cb" path="/var/lib/kubelet/pods/a90da115-522c-4858-935f-7d4a7211c8cb/volumes"
Mar 20 11:20:22 crc kubenswrapper[4860]: I0320 11:20:22.344551 4860 patch_prober.go:28] interesting pod/machine-config-daemon-kvdqp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 11:20:22 crc kubenswrapper[4860]: I0320 11:20:22.345461 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 11:20:48 crc kubenswrapper[4860]: I0320 11:20:48.013109 4860 scope.go:117] "RemoveContainer" containerID="f841889007b20caf67f3aa615ee7ba8514f947be4c45bbec766af3b7f7efe1d8"
Mar 20 11:20:52 crc kubenswrapper[4860]: I0320 11:20:52.345116 4860 patch_prober.go:28] interesting pod/machine-config-daemon-kvdqp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 11:20:52 crc kubenswrapper[4860]: I0320 11:20:52.347710 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 11:21:22 crc kubenswrapper[4860]: I0320 11:21:22.345281 4860 patch_prober.go:28] interesting pod/machine-config-daemon-kvdqp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 11:21:22 crc kubenswrapper[4860]: I0320 11:21:22.346045 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 11:21:22 crc kubenswrapper[4860]: I0320 11:21:22.346113 4860 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp"
Mar 20 11:21:22 crc kubenswrapper[4860]: I0320 11:21:22.346941 4860 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"07ab4f7bad4a73673176b5c6eb66987e832ab6b48e2270c87816fed93c5b31d4"} pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 11:21:22 crc kubenswrapper[4860]: I0320 11:21:22.347010 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" containerID="cri-o://07ab4f7bad4a73673176b5c6eb66987e832ab6b48e2270c87816fed93c5b31d4" gracePeriod=600
Mar 20 11:21:22 crc kubenswrapper[4860]: E0320 11:21:22.471775 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080"
Mar 20 11:21:23 crc kubenswrapper[4860]: I0320 11:21:23.285063 4860 generic.go:334] "Generic (PLEG): container finished" podID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerID="07ab4f7bad4a73673176b5c6eb66987e832ab6b48e2270c87816fed93c5b31d4" exitCode=0
Mar 20 11:21:23 crc kubenswrapper[4860]: I0320 11:21:23.285103 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" event={"ID":"6a9df230-75a1-4b64-8d00-c179e9c19080","Type":"ContainerDied","Data":"07ab4f7bad4a73673176b5c6eb66987e832ab6b48e2270c87816fed93c5b31d4"}
Mar 20 11:21:23 crc kubenswrapper[4860]: I0320 11:21:23.285698 4860 scope.go:117] "RemoveContainer" containerID="30e51fcb3abe382780764ee7923f09110290ea7a894489479cd1fa264f3e332e"
Mar 20 11:21:23 crc kubenswrapper[4860]: I0320 11:21:23.286477 4860 scope.go:117] "RemoveContainer" containerID="07ab4f7bad4a73673176b5c6eb66987e832ab6b48e2270c87816fed93c5b31d4"
Mar 20 11:21:23 crc kubenswrapper[4860]: E0320 11:21:23.286767 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080"
Mar 20 11:21:29 crc kubenswrapper[4860]: I0320 11:21:29.040628 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9gvjr"]
Mar 20 11:21:29 crc kubenswrapper[4860]: E0320 11:21:29.042176 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02d4a854-e21a-46b4-976b-17645af17c8b" containerName="oc"
Mar 20 11:21:29 crc kubenswrapper[4860]: I0320 11:21:29.042199 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="02d4a854-e21a-46b4-976b-17645af17c8b" containerName="oc"
Mar 20 11:21:29 crc kubenswrapper[4860]: I0320 11:21:29.042462 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="02d4a854-e21a-46b4-976b-17645af17c8b" containerName="oc"
Mar 20 11:21:29 crc kubenswrapper[4860]: I0320 11:21:29.043889 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9gvjr"
Mar 20 11:21:29 crc kubenswrapper[4860]: I0320 11:21:29.061866 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9gvjr"]
Mar 20 11:21:29 crc kubenswrapper[4860]: I0320 11:21:29.084853 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fa1c68c-e71f-456c-a53f-1ba28dd3952f-utilities\") pod \"redhat-operators-9gvjr\" (UID: \"8fa1c68c-e71f-456c-a53f-1ba28dd3952f\") " pod="openshift-marketplace/redhat-operators-9gvjr"
Mar 20 11:21:29 crc kubenswrapper[4860]: I0320 11:21:29.084903 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fa1c68c-e71f-456c-a53f-1ba28dd3952f-catalog-content\") pod \"redhat-operators-9gvjr\" (UID: \"8fa1c68c-e71f-456c-a53f-1ba28dd3952f\") "
pod="openshift-marketplace/redhat-operators-9gvjr"
Mar 20 11:21:29 crc kubenswrapper[4860]: I0320 11:21:29.084956 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5zsx\" (UniqueName: \"kubernetes.io/projected/8fa1c68c-e71f-456c-a53f-1ba28dd3952f-kube-api-access-v5zsx\") pod \"redhat-operators-9gvjr\" (UID: \"8fa1c68c-e71f-456c-a53f-1ba28dd3952f\") " pod="openshift-marketplace/redhat-operators-9gvjr"
Mar 20 11:21:29 crc kubenswrapper[4860]: I0320 11:21:29.186339 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5zsx\" (UniqueName: \"kubernetes.io/projected/8fa1c68c-e71f-456c-a53f-1ba28dd3952f-kube-api-access-v5zsx\") pod \"redhat-operators-9gvjr\" (UID: \"8fa1c68c-e71f-456c-a53f-1ba28dd3952f\") " pod="openshift-marketplace/redhat-operators-9gvjr"
Mar 20 11:21:29 crc kubenswrapper[4860]: I0320 11:21:29.186496 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fa1c68c-e71f-456c-a53f-1ba28dd3952f-utilities\") pod \"redhat-operators-9gvjr\" (UID: \"8fa1c68c-e71f-456c-a53f-1ba28dd3952f\") " pod="openshift-marketplace/redhat-operators-9gvjr"
Mar 20 11:21:29 crc kubenswrapper[4860]: I0320 11:21:29.186530 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fa1c68c-e71f-456c-a53f-1ba28dd3952f-catalog-content\") pod \"redhat-operators-9gvjr\" (UID: \"8fa1c68c-e71f-456c-a53f-1ba28dd3952f\") " pod="openshift-marketplace/redhat-operators-9gvjr"
Mar 20 11:21:29 crc kubenswrapper[4860]: I0320 11:21:29.187299 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fa1c68c-e71f-456c-a53f-1ba28dd3952f-catalog-content\") pod \"redhat-operators-9gvjr\" (UID: \"8fa1c68c-e71f-456c-a53f-1ba28dd3952f\") " pod="openshift-marketplace/redhat-operators-9gvjr"
Mar 20 11:21:29 crc kubenswrapper[4860]: I0320 11:21:29.187319 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fa1c68c-e71f-456c-a53f-1ba28dd3952f-utilities\") pod \"redhat-operators-9gvjr\" (UID: \"8fa1c68c-e71f-456c-a53f-1ba28dd3952f\") " pod="openshift-marketplace/redhat-operators-9gvjr"
Mar 20 11:21:29 crc kubenswrapper[4860]: I0320 11:21:29.208958 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5zsx\" (UniqueName: \"kubernetes.io/projected/8fa1c68c-e71f-456c-a53f-1ba28dd3952f-kube-api-access-v5zsx\") pod \"redhat-operators-9gvjr\" (UID: \"8fa1c68c-e71f-456c-a53f-1ba28dd3952f\") " pod="openshift-marketplace/redhat-operators-9gvjr"
Mar 20 11:21:29 crc kubenswrapper[4860]: I0320 11:21:29.380216 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9gvjr"
Mar 20 11:21:29 crc kubenswrapper[4860]: I0320 11:21:29.853520 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9gvjr"]
Mar 20 11:21:30 crc kubenswrapper[4860]: I0320 11:21:30.369255 4860 generic.go:334] "Generic (PLEG): container finished" podID="8fa1c68c-e71f-456c-a53f-1ba28dd3952f" containerID="bd1b68278e2421bec44699bb69debaba80a232532bb0174492da79e52abc9baf" exitCode=0
Mar 20 11:21:30 crc kubenswrapper[4860]: I0320 11:21:30.369459 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9gvjr" event={"ID":"8fa1c68c-e71f-456c-a53f-1ba28dd3952f","Type":"ContainerDied","Data":"bd1b68278e2421bec44699bb69debaba80a232532bb0174492da79e52abc9baf"}
Mar 20 11:21:30 crc kubenswrapper[4860]: I0320 11:21:30.369566 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9gvjr" event={"ID":"8fa1c68c-e71f-456c-a53f-1ba28dd3952f","Type":"ContainerStarted","Data":"fcc3ad862451d2531d00ef7f4051c1f59243588e81decdce68473cd448e73080"}
Mar 20 11:21:30 crc kubenswrapper[4860]: I0320 11:21:30.372296 4860 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 20 11:21:32 crc kubenswrapper[4860]: I0320 11:21:32.390007 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9gvjr" event={"ID":"8fa1c68c-e71f-456c-a53f-1ba28dd3952f","Type":"ContainerStarted","Data":"a15ba701649b81b12d2b014a552468aa2bfff91a5766797a31426e0bf11ff89f"}
Mar 20 11:21:33 crc kubenswrapper[4860]: I0320 11:21:33.399407 4860 generic.go:334] "Generic (PLEG): container finished" podID="8fa1c68c-e71f-456c-a53f-1ba28dd3952f" containerID="a15ba701649b81b12d2b014a552468aa2bfff91a5766797a31426e0bf11ff89f" exitCode=0
Mar 20 11:21:33 crc kubenswrapper[4860]: I0320 11:21:33.399520 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9gvjr" event={"ID":"8fa1c68c-e71f-456c-a53f-1ba28dd3952f","Type":"ContainerDied","Data":"a15ba701649b81b12d2b014a552468aa2bfff91a5766797a31426e0bf11ff89f"}
Mar 20 11:21:34 crc kubenswrapper[4860]: I0320 11:21:34.410619 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9gvjr" event={"ID":"8fa1c68c-e71f-456c-a53f-1ba28dd3952f","Type":"ContainerStarted","Data":"a0b6912b94144e45036f3769d17d2bd8ade70f17a27dfd648678da71cf95cfbd"}
Mar 20 11:21:34 crc kubenswrapper[4860]: I0320 11:21:34.413859 4860 scope.go:117] "RemoveContainer" containerID="07ab4f7bad4a73673176b5c6eb66987e832ab6b48e2270c87816fed93c5b31d4"
Mar 20 11:21:34 crc kubenswrapper[4860]: E0320 11:21:34.414272 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080"
Mar 20 11:21:34 crc kubenswrapper[4860]: I0320 11:21:34.437767 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9gvjr" podStartSLOduration=1.902814372 podStartE2EDuration="5.437739146s" podCreationTimestamp="2026-03-20 11:21:29 +0000 UTC" firstStartedPulling="2026-03-20 11:21:30.372072807 +0000 UTC m=+1614.593433695" lastFinishedPulling="2026-03-20 11:21:33.906997571 +0000 UTC m=+1618.128358469" observedRunningTime="2026-03-20 11:21:34.431200008 +0000 UTC m=+1618.652560906" watchObservedRunningTime="2026-03-20 11:21:34.437739146 +0000 UTC m=+1618.659100044"
Mar 20 11:21:39 crc kubenswrapper[4860]: I0320 11:21:39.380665 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9gvjr"
Mar 20 11:21:39 crc kubenswrapper[4860]: I0320 11:21:39.381090 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9gvjr"
Mar 20 11:21:40 crc kubenswrapper[4860]: I0320 11:21:40.430303 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9gvjr" podUID="8fa1c68c-e71f-456c-a53f-1ba28dd3952f" containerName="registry-server" probeResult="failure" output=<
Mar 20 11:21:40 crc kubenswrapper[4860]: timeout: failed to connect service ":50051" within 1s
Mar 20 11:21:40 crc kubenswrapper[4860]: >
Mar 20 11:21:48 crc kubenswrapper[4860]: I0320 11:21:48.414108 4860 scope.go:117] "RemoveContainer" containerID="07ab4f7bad4a73673176b5c6eb66987e832ab6b48e2270c87816fed93c5b31d4"
Mar 20 11:21:48 crc kubenswrapper[4860]: E0320 11:21:48.415302 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080"
Mar 20 11:21:49 crc kubenswrapper[4860]: I0320 11:21:49.430482 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9gvjr"
Mar 20 11:21:49 crc kubenswrapper[4860]: I0320 11:21:49.477800 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9gvjr"
Mar 20 11:21:49 crc kubenswrapper[4860]: I0320 11:21:49.665126 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9gvjr"]
Mar 20 11:21:50 crc kubenswrapper[4860]: I0320 11:21:50.554220 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9gvjr" podUID="8fa1c68c-e71f-456c-a53f-1ba28dd3952f" containerName="registry-server" containerID="cri-o://a0b6912b94144e45036f3769d17d2bd8ade70f17a27dfd648678da71cf95cfbd" gracePeriod=2
Mar 20 11:21:51 crc kubenswrapper[4860]: I0320 11:21:51.513651 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9gvjr"
Mar 20 11:21:51 crc kubenswrapper[4860]: I0320 11:21:51.564309 4860 generic.go:334] "Generic (PLEG): container finished" podID="8fa1c68c-e71f-456c-a53f-1ba28dd3952f" containerID="a0b6912b94144e45036f3769d17d2bd8ade70f17a27dfd648678da71cf95cfbd" exitCode=0
Mar 20 11:21:51 crc kubenswrapper[4860]: I0320 11:21:51.564369 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9gvjr" event={"ID":"8fa1c68c-e71f-456c-a53f-1ba28dd3952f","Type":"ContainerDied","Data":"a0b6912b94144e45036f3769d17d2bd8ade70f17a27dfd648678da71cf95cfbd"}
Mar 20 11:21:51 crc kubenswrapper[4860]: I0320 11:21:51.564407 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9gvjr" event={"ID":"8fa1c68c-e71f-456c-a53f-1ba28dd3952f","Type":"ContainerDied","Data":"fcc3ad862451d2531d00ef7f4051c1f59243588e81decdce68473cd448e73080"}
Mar 20 11:21:51 crc kubenswrapper[4860]: I0320 11:21:51.564430 4860 scope.go:117] "RemoveContainer" containerID="a0b6912b94144e45036f3769d17d2bd8ade70f17a27dfd648678da71cf95cfbd"
Mar 20 11:21:51 crc kubenswrapper[4860]: I0320 11:21:51.564657 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9gvjr"
Mar 20 11:21:51 crc kubenswrapper[4860]: I0320 11:21:51.590299 4860 scope.go:117] "RemoveContainer" containerID="a15ba701649b81b12d2b014a552468aa2bfff91a5766797a31426e0bf11ff89f"
Mar 20 11:21:51 crc kubenswrapper[4860]: I0320 11:21:51.615738 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fa1c68c-e71f-456c-a53f-1ba28dd3952f-utilities\") pod \"8fa1c68c-e71f-456c-a53f-1ba28dd3952f\" (UID: \"8fa1c68c-e71f-456c-a53f-1ba28dd3952f\") "
Mar 20 11:21:51 crc kubenswrapper[4860]: I0320 11:21:51.615887 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fa1c68c-e71f-456c-a53f-1ba28dd3952f-catalog-content\") pod \"8fa1c68c-e71f-456c-a53f-1ba28dd3952f\" (UID: \"8fa1c68c-e71f-456c-a53f-1ba28dd3952f\") "
Mar 20 11:21:51 crc kubenswrapper[4860]: I0320 11:21:51.616000 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5zsx\" (UniqueName: \"kubernetes.io/projected/8fa1c68c-e71f-456c-a53f-1ba28dd3952f-kube-api-access-v5zsx\") pod \"8fa1c68c-e71f-456c-a53f-1ba28dd3952f\" (UID: \"8fa1c68c-e71f-456c-a53f-1ba28dd3952f\") "
Mar 20 11:21:51 crc kubenswrapper[4860]: I0320 11:21:51.616069 4860 scope.go:117] "RemoveContainer" containerID="bd1b68278e2421bec44699bb69debaba80a232532bb0174492da79e52abc9baf"
Mar 20 11:21:51 crc kubenswrapper[4860]: I0320 11:21:51.617560 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fa1c68c-e71f-456c-a53f-1ba28dd3952f-utilities" (OuterVolumeSpecName: "utilities") pod "8fa1c68c-e71f-456c-a53f-1ba28dd3952f" (UID: "8fa1c68c-e71f-456c-a53f-1ba28dd3952f"). InnerVolumeSpecName "utilities".
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 11:21:51 crc kubenswrapper[4860]: I0320 11:21:51.633731 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fa1c68c-e71f-456c-a53f-1ba28dd3952f-kube-api-access-v5zsx" (OuterVolumeSpecName: "kube-api-access-v5zsx") pod "8fa1c68c-e71f-456c-a53f-1ba28dd3952f" (UID: "8fa1c68c-e71f-456c-a53f-1ba28dd3952f"). InnerVolumeSpecName "kube-api-access-v5zsx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 11:21:51 crc kubenswrapper[4860]: I0320 11:21:51.688421 4860 scope.go:117] "RemoveContainer" containerID="a0b6912b94144e45036f3769d17d2bd8ade70f17a27dfd648678da71cf95cfbd"
Mar 20 11:21:51 crc kubenswrapper[4860]: E0320 11:21:51.706880 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0b6912b94144e45036f3769d17d2bd8ade70f17a27dfd648678da71cf95cfbd\": container with ID starting with a0b6912b94144e45036f3769d17d2bd8ade70f17a27dfd648678da71cf95cfbd not found: ID does not exist" containerID="a0b6912b94144e45036f3769d17d2bd8ade70f17a27dfd648678da71cf95cfbd"
Mar 20 11:21:51 crc kubenswrapper[4860]: I0320 11:21:51.706946 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0b6912b94144e45036f3769d17d2bd8ade70f17a27dfd648678da71cf95cfbd"} err="failed to get container status \"a0b6912b94144e45036f3769d17d2bd8ade70f17a27dfd648678da71cf95cfbd\": rpc error: code = NotFound desc = could not find container \"a0b6912b94144e45036f3769d17d2bd8ade70f17a27dfd648678da71cf95cfbd\": container with ID starting with a0b6912b94144e45036f3769d17d2bd8ade70f17a27dfd648678da71cf95cfbd not found: ID does not exist"
Mar 20 11:21:51 crc kubenswrapper[4860]: I0320 11:21:51.707071 4860 scope.go:117] "RemoveContainer" containerID="a15ba701649b81b12d2b014a552468aa2bfff91a5766797a31426e0bf11ff89f"
Mar 20 11:21:51 crc kubenswrapper[4860]: E0320 11:21:51.707558 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a15ba701649b81b12d2b014a552468aa2bfff91a5766797a31426e0bf11ff89f\": container with ID starting with a15ba701649b81b12d2b014a552468aa2bfff91a5766797a31426e0bf11ff89f not found: ID does not exist" containerID="a15ba701649b81b12d2b014a552468aa2bfff91a5766797a31426e0bf11ff89f"
Mar 20 11:21:51 crc kubenswrapper[4860]: I0320 11:21:51.707592 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a15ba701649b81b12d2b014a552468aa2bfff91a5766797a31426e0bf11ff89f"} err="failed to get container status \"a15ba701649b81b12d2b014a552468aa2bfff91a5766797a31426e0bf11ff89f\": rpc error: code = NotFound desc = could not find container \"a15ba701649b81b12d2b014a552468aa2bfff91a5766797a31426e0bf11ff89f\": container with ID starting with a15ba701649b81b12d2b014a552468aa2bfff91a5766797a31426e0bf11ff89f not found: ID does not exist"
Mar 20 11:21:51 crc kubenswrapper[4860]: I0320 11:21:51.707609 4860 scope.go:117] "RemoveContainer" containerID="bd1b68278e2421bec44699bb69debaba80a232532bb0174492da79e52abc9baf"
Mar 20 11:21:51 crc kubenswrapper[4860]: E0320 11:21:51.708156 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd1b68278e2421bec44699bb69debaba80a232532bb0174492da79e52abc9baf\": container with ID starting with bd1b68278e2421bec44699bb69debaba80a232532bb0174492da79e52abc9baf not found: ID does not exist" containerID="bd1b68278e2421bec44699bb69debaba80a232532bb0174492da79e52abc9baf"
Mar 20 11:21:51 crc kubenswrapper[4860]: I0320 11:21:51.708200 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd1b68278e2421bec44699bb69debaba80a232532bb0174492da79e52abc9baf"} err="failed to get container status \"bd1b68278e2421bec44699bb69debaba80a232532bb0174492da79e52abc9baf\": rpc error: code = NotFound desc = could not find container \"bd1b68278e2421bec44699bb69debaba80a232532bb0174492da79e52abc9baf\": container with ID starting with bd1b68278e2421bec44699bb69debaba80a232532bb0174492da79e52abc9baf not found: ID does not exist"
Mar 20 11:21:51 crc kubenswrapper[4860]: I0320 11:21:51.717582 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5zsx\" (UniqueName: \"kubernetes.io/projected/8fa1c68c-e71f-456c-a53f-1ba28dd3952f-kube-api-access-v5zsx\") on node \"crc\" DevicePath \"\""
Mar 20 11:21:51 crc kubenswrapper[4860]: I0320 11:21:51.717621 4860 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fa1c68c-e71f-456c-a53f-1ba28dd3952f-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 11:21:51 crc kubenswrapper[4860]: I0320 11:21:51.764608 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fa1c68c-e71f-456c-a53f-1ba28dd3952f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8fa1c68c-e71f-456c-a53f-1ba28dd3952f" (UID: "8fa1c68c-e71f-456c-a53f-1ba28dd3952f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 11:21:51 crc kubenswrapper[4860]: I0320 11:21:51.819441 4860 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fa1c68c-e71f-456c-a53f-1ba28dd3952f-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 11:21:51 crc kubenswrapper[4860]: I0320 11:21:51.898987 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9gvjr"]
Mar 20 11:21:51 crc kubenswrapper[4860]: I0320 11:21:51.913392 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9gvjr"]
Mar 20 11:21:53 crc kubenswrapper[4860]: I0320 11:21:53.424284 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fa1c68c-e71f-456c-a53f-1ba28dd3952f" path="/var/lib/kubelet/pods/8fa1c68c-e71f-456c-a53f-1ba28dd3952f/volumes"
Mar 20 11:22:00 crc kubenswrapper[4860]: I0320 11:22:00.149561 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566762-4zgtb"]
Mar 20 11:22:00 crc kubenswrapper[4860]: E0320 11:22:00.150936 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fa1c68c-e71f-456c-a53f-1ba28dd3952f" containerName="registry-server"
Mar 20 11:22:00 crc kubenswrapper[4860]: I0320 11:22:00.150956 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fa1c68c-e71f-456c-a53f-1ba28dd3952f" containerName="registry-server"
Mar 20 11:22:00 crc kubenswrapper[4860]: E0320 11:22:00.150980 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fa1c68c-e71f-456c-a53f-1ba28dd3952f" containerName="extract-utilities"
Mar 20 11:22:00 crc kubenswrapper[4860]: I0320 11:22:00.150988 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fa1c68c-e71f-456c-a53f-1ba28dd3952f" containerName="extract-utilities"
Mar 20 11:22:00 crc kubenswrapper[4860]: E0320 11:22:00.151022 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fa1c68c-e71f-456c-a53f-1ba28dd3952f" containerName="extract-content"
Mar 20 11:22:00 crc kubenswrapper[4860]: I0320 11:22:00.151031 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fa1c68c-e71f-456c-a53f-1ba28dd3952f" containerName="extract-content"
Mar 20 11:22:00 crc kubenswrapper[4860]: I0320 11:22:00.151252 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fa1c68c-e71f-456c-a53f-1ba28dd3952f" containerName="registry-server"
Mar 20 11:22:00 crc kubenswrapper[4860]: I0320 11:22:00.151982 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566762-4zgtb"
Mar 20 11:22:00 crc kubenswrapper[4860]: I0320 11:22:00.155131 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 11:22:00 crc kubenswrapper[4860]: I0320 11:22:00.155707 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-82p2f"
Mar 20 11:22:00 crc kubenswrapper[4860]: I0320 11:22:00.156477 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 11:22:00 crc kubenswrapper[4860]: I0320 11:22:00.161681 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566762-4zgtb"]
Mar 20 11:22:00 crc kubenswrapper[4860]: I0320 11:22:00.262322 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5bnl\" (UniqueName: \"kubernetes.io/projected/92fe5f45-6751-47ad-ba62-ce45b44f7460-kube-api-access-w5bnl\") pod \"auto-csr-approver-29566762-4zgtb\" (UID: \"92fe5f45-6751-47ad-ba62-ce45b44f7460\") " pod="openshift-infra/auto-csr-approver-29566762-4zgtb"
Mar 20 11:22:00 crc kubenswrapper[4860]: I0320 11:22:00.363678 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5bnl\" (UniqueName: \"kubernetes.io/projected/92fe5f45-6751-47ad-ba62-ce45b44f7460-kube-api-access-w5bnl\") pod \"auto-csr-approver-29566762-4zgtb\" (UID: \"92fe5f45-6751-47ad-ba62-ce45b44f7460\") " pod="openshift-infra/auto-csr-approver-29566762-4zgtb"
Mar 20 11:22:00 crc kubenswrapper[4860]: I0320 11:22:00.390038 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5bnl\" (UniqueName: \"kubernetes.io/projected/92fe5f45-6751-47ad-ba62-ce45b44f7460-kube-api-access-w5bnl\") pod \"auto-csr-approver-29566762-4zgtb\" (UID: \"92fe5f45-6751-47ad-ba62-ce45b44f7460\") " pod="openshift-infra/auto-csr-approver-29566762-4zgtb"
Mar 20 11:22:00 crc kubenswrapper[4860]: I0320 11:22:00.413910 4860 scope.go:117] "RemoveContainer" containerID="07ab4f7bad4a73673176b5c6eb66987e832ab6b48e2270c87816fed93c5b31d4"
Mar 20 11:22:00 crc kubenswrapper[4860]: E0320 11:22:00.414189 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080"
Mar 20 11:22:00 crc kubenswrapper[4860]: I0320 11:22:00.490422 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566762-4zgtb"
Mar 20 11:22:00 crc kubenswrapper[4860]: I0320 11:22:00.993599 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566762-4zgtb"]
Mar 20 11:22:01 crc kubenswrapper[4860]: I0320 11:22:01.666251 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566762-4zgtb" event={"ID":"92fe5f45-6751-47ad-ba62-ce45b44f7460","Type":"ContainerStarted","Data":"b24a66f2d341276b0e735c7142247ede073361f3f02e8652e5b3749b9ba79ed7"}
Mar 20 11:22:03 crc kubenswrapper[4860]: I0320 11:22:03.683507 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566762-4zgtb" event={"ID":"92fe5f45-6751-47ad-ba62-ce45b44f7460","Type":"ContainerStarted","Data":"35b3d06c932db51a34b9c266a504a6d9855ee6f888b737047ccd7f4d521d88bd"}
Mar 20 11:22:04 crc kubenswrapper[4860]: I0320 11:22:04.695980 4860 generic.go:334] "Generic (PLEG): container finished" podID="92fe5f45-6751-47ad-ba62-ce45b44f7460" containerID="35b3d06c932db51a34b9c266a504a6d9855ee6f888b737047ccd7f4d521d88bd" exitCode=0
Mar 20 11:22:04 crc kubenswrapper[4860]: I0320 11:22:04.696101 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566762-4zgtb" event={"ID":"92fe5f45-6751-47ad-ba62-ce45b44f7460","Type":"ContainerDied","Data":"35b3d06c932db51a34b9c266a504a6d9855ee6f888b737047ccd7f4d521d88bd"}
Mar 20 11:22:05 crc kubenswrapper[4860]: I0320 11:22:05.036617 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566762-4zgtb" Mar 20 11:22:05 crc kubenswrapper[4860]: I0320 11:22:05.139770 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5bnl\" (UniqueName: \"kubernetes.io/projected/92fe5f45-6751-47ad-ba62-ce45b44f7460-kube-api-access-w5bnl\") pod \"92fe5f45-6751-47ad-ba62-ce45b44f7460\" (UID: \"92fe5f45-6751-47ad-ba62-ce45b44f7460\") " Mar 20 11:22:05 crc kubenswrapper[4860]: I0320 11:22:05.148634 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92fe5f45-6751-47ad-ba62-ce45b44f7460-kube-api-access-w5bnl" (OuterVolumeSpecName: "kube-api-access-w5bnl") pod "92fe5f45-6751-47ad-ba62-ce45b44f7460" (UID: "92fe5f45-6751-47ad-ba62-ce45b44f7460"). InnerVolumeSpecName "kube-api-access-w5bnl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:22:05 crc kubenswrapper[4860]: I0320 11:22:05.241792 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5bnl\" (UniqueName: \"kubernetes.io/projected/92fe5f45-6751-47ad-ba62-ce45b44f7460-kube-api-access-w5bnl\") on node \"crc\" DevicePath \"\"" Mar 20 11:22:05 crc kubenswrapper[4860]: I0320 11:22:05.708823 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566762-4zgtb" event={"ID":"92fe5f45-6751-47ad-ba62-ce45b44f7460","Type":"ContainerDied","Data":"b24a66f2d341276b0e735c7142247ede073361f3f02e8652e5b3749b9ba79ed7"} Mar 20 11:22:05 crc kubenswrapper[4860]: I0320 11:22:05.708869 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b24a66f2d341276b0e735c7142247ede073361f3f02e8652e5b3749b9ba79ed7" Mar 20 11:22:05 crc kubenswrapper[4860]: I0320 11:22:05.708999 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566762-4zgtb" Mar 20 11:22:06 crc kubenswrapper[4860]: I0320 11:22:06.115733 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566756-77l7w"] Mar 20 11:22:06 crc kubenswrapper[4860]: I0320 11:22:06.122184 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566756-77l7w"] Mar 20 11:22:07 crc kubenswrapper[4860]: I0320 11:22:07.421970 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="638de697-8881-4bb2-b204-2e87655dccbf" path="/var/lib/kubelet/pods/638de697-8881-4bb2-b204-2e87655dccbf/volumes" Mar 20 11:22:12 crc kubenswrapper[4860]: I0320 11:22:12.413703 4860 scope.go:117] "RemoveContainer" containerID="07ab4f7bad4a73673176b5c6eb66987e832ab6b48e2270c87816fed93c5b31d4" Mar 20 11:22:12 crc kubenswrapper[4860]: E0320 11:22:12.414898 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:22:23 crc kubenswrapper[4860]: I0320 11:22:23.110099 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-55zcw"] Mar 20 11:22:23 crc kubenswrapper[4860]: E0320 11:22:23.111006 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92fe5f45-6751-47ad-ba62-ce45b44f7460" containerName="oc" Mar 20 11:22:23 crc kubenswrapper[4860]: I0320 11:22:23.111022 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="92fe5f45-6751-47ad-ba62-ce45b44f7460" containerName="oc" Mar 20 11:22:23 crc kubenswrapper[4860]: I0320 11:22:23.111206 4860 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="92fe5f45-6751-47ad-ba62-ce45b44f7460" containerName="oc" Mar 20 11:22:23 crc kubenswrapper[4860]: I0320 11:22:23.112590 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-55zcw" Mar 20 11:22:23 crc kubenswrapper[4860]: I0320 11:22:23.126347 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-55zcw"] Mar 20 11:22:23 crc kubenswrapper[4860]: I0320 11:22:23.253050 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a827151-75d9-472f-9cd8-bd45629d4c42-utilities\") pod \"certified-operators-55zcw\" (UID: \"4a827151-75d9-472f-9cd8-bd45629d4c42\") " pod="openshift-marketplace/certified-operators-55zcw" Mar 20 11:22:23 crc kubenswrapper[4860]: I0320 11:22:23.253150 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gx6q5\" (UniqueName: \"kubernetes.io/projected/4a827151-75d9-472f-9cd8-bd45629d4c42-kube-api-access-gx6q5\") pod \"certified-operators-55zcw\" (UID: \"4a827151-75d9-472f-9cd8-bd45629d4c42\") " pod="openshift-marketplace/certified-operators-55zcw" Mar 20 11:22:23 crc kubenswrapper[4860]: I0320 11:22:23.253438 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a827151-75d9-472f-9cd8-bd45629d4c42-catalog-content\") pod \"certified-operators-55zcw\" (UID: \"4a827151-75d9-472f-9cd8-bd45629d4c42\") " pod="openshift-marketplace/certified-operators-55zcw" Mar 20 11:22:23 crc kubenswrapper[4860]: I0320 11:22:23.355032 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a827151-75d9-472f-9cd8-bd45629d4c42-utilities\") pod \"certified-operators-55zcw\" (UID: 
\"4a827151-75d9-472f-9cd8-bd45629d4c42\") " pod="openshift-marketplace/certified-operators-55zcw" Mar 20 11:22:23 crc kubenswrapper[4860]: I0320 11:22:23.355111 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gx6q5\" (UniqueName: \"kubernetes.io/projected/4a827151-75d9-472f-9cd8-bd45629d4c42-kube-api-access-gx6q5\") pod \"certified-operators-55zcw\" (UID: \"4a827151-75d9-472f-9cd8-bd45629d4c42\") " pod="openshift-marketplace/certified-operators-55zcw" Mar 20 11:22:23 crc kubenswrapper[4860]: I0320 11:22:23.355154 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a827151-75d9-472f-9cd8-bd45629d4c42-catalog-content\") pod \"certified-operators-55zcw\" (UID: \"4a827151-75d9-472f-9cd8-bd45629d4c42\") " pod="openshift-marketplace/certified-operators-55zcw" Mar 20 11:22:23 crc kubenswrapper[4860]: I0320 11:22:23.355675 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a827151-75d9-472f-9cd8-bd45629d4c42-catalog-content\") pod \"certified-operators-55zcw\" (UID: \"4a827151-75d9-472f-9cd8-bd45629d4c42\") " pod="openshift-marketplace/certified-operators-55zcw" Mar 20 11:22:23 crc kubenswrapper[4860]: I0320 11:22:23.356239 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a827151-75d9-472f-9cd8-bd45629d4c42-utilities\") pod \"certified-operators-55zcw\" (UID: \"4a827151-75d9-472f-9cd8-bd45629d4c42\") " pod="openshift-marketplace/certified-operators-55zcw" Mar 20 11:22:23 crc kubenswrapper[4860]: I0320 11:22:23.379298 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gx6q5\" (UniqueName: \"kubernetes.io/projected/4a827151-75d9-472f-9cd8-bd45629d4c42-kube-api-access-gx6q5\") pod \"certified-operators-55zcw\" (UID: 
\"4a827151-75d9-472f-9cd8-bd45629d4c42\") " pod="openshift-marketplace/certified-operators-55zcw" Mar 20 11:22:23 crc kubenswrapper[4860]: I0320 11:22:23.413797 4860 scope.go:117] "RemoveContainer" containerID="07ab4f7bad4a73673176b5c6eb66987e832ab6b48e2270c87816fed93c5b31d4" Mar 20 11:22:23 crc kubenswrapper[4860]: E0320 11:22:23.414164 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:22:23 crc kubenswrapper[4860]: I0320 11:22:23.441250 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-55zcw" Mar 20 11:22:23 crc kubenswrapper[4860]: I0320 11:22:23.959916 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-55zcw"] Mar 20 11:22:24 crc kubenswrapper[4860]: I0320 11:22:24.864701 4860 generic.go:334] "Generic (PLEG): container finished" podID="4a827151-75d9-472f-9cd8-bd45629d4c42" containerID="7ca363352ba6feb8cf71be3a28e5c8eb8525d4e02376901124ce7391b7b5f726" exitCode=0 Mar 20 11:22:24 crc kubenswrapper[4860]: I0320 11:22:24.865255 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-55zcw" event={"ID":"4a827151-75d9-472f-9cd8-bd45629d4c42","Type":"ContainerDied","Data":"7ca363352ba6feb8cf71be3a28e5c8eb8525d4e02376901124ce7391b7b5f726"} Mar 20 11:22:24 crc kubenswrapper[4860]: I0320 11:22:24.865292 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-55zcw" 
event={"ID":"4a827151-75d9-472f-9cd8-bd45629d4c42","Type":"ContainerStarted","Data":"b5fa759c5bdc4dc59a530ac28ff28e464c1717a4acb2944a6d76e0c54f12fb36"} Mar 20 11:22:25 crc kubenswrapper[4860]: I0320 11:22:25.878149 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-55zcw" event={"ID":"4a827151-75d9-472f-9cd8-bd45629d4c42","Type":"ContainerStarted","Data":"c43e402dc03d939aa3cc72188039cb78ca9d23e9b95540d5201d55b2128b616e"} Mar 20 11:22:26 crc kubenswrapper[4860]: I0320 11:22:26.888938 4860 generic.go:334] "Generic (PLEG): container finished" podID="4a827151-75d9-472f-9cd8-bd45629d4c42" containerID="c43e402dc03d939aa3cc72188039cb78ca9d23e9b95540d5201d55b2128b616e" exitCode=0 Mar 20 11:22:26 crc kubenswrapper[4860]: I0320 11:22:26.889002 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-55zcw" event={"ID":"4a827151-75d9-472f-9cd8-bd45629d4c42","Type":"ContainerDied","Data":"c43e402dc03d939aa3cc72188039cb78ca9d23e9b95540d5201d55b2128b616e"} Mar 20 11:22:28 crc kubenswrapper[4860]: I0320 11:22:28.912081 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-55zcw" event={"ID":"4a827151-75d9-472f-9cd8-bd45629d4c42","Type":"ContainerStarted","Data":"781b9d20ec35c7a0424aa30bd44f6e040f8d8d2cc92f8320ace80fd2b6217c2d"} Mar 20 11:22:28 crc kubenswrapper[4860]: I0320 11:22:28.940022 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-55zcw" podStartSLOduration=2.715966631 podStartE2EDuration="5.939993554s" podCreationTimestamp="2026-03-20 11:22:23 +0000 UTC" firstStartedPulling="2026-03-20 11:22:24.86715643 +0000 UTC m=+1669.088517328" lastFinishedPulling="2026-03-20 11:22:28.091183353 +0000 UTC m=+1672.312544251" observedRunningTime="2026-03-20 11:22:28.939991354 +0000 UTC m=+1673.161352262" watchObservedRunningTime="2026-03-20 11:22:28.939993554 +0000 UTC 
m=+1673.161354452" Mar 20 11:22:32 crc kubenswrapper[4860]: I0320 11:22:32.782512 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vlqnl"] Mar 20 11:22:32 crc kubenswrapper[4860]: I0320 11:22:32.784895 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vlqnl" Mar 20 11:22:32 crc kubenswrapper[4860]: I0320 11:22:32.798842 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vlqnl"] Mar 20 11:22:32 crc kubenswrapper[4860]: I0320 11:22:32.905374 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8225bb93-169a-41cc-bdec-d466c6aa140f-catalog-content\") pod \"redhat-marketplace-vlqnl\" (UID: \"8225bb93-169a-41cc-bdec-d466c6aa140f\") " pod="openshift-marketplace/redhat-marketplace-vlqnl" Mar 20 11:22:32 crc kubenswrapper[4860]: I0320 11:22:32.905445 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8225bb93-169a-41cc-bdec-d466c6aa140f-utilities\") pod \"redhat-marketplace-vlqnl\" (UID: \"8225bb93-169a-41cc-bdec-d466c6aa140f\") " pod="openshift-marketplace/redhat-marketplace-vlqnl" Mar 20 11:22:32 crc kubenswrapper[4860]: I0320 11:22:32.905509 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bz6v\" (UniqueName: \"kubernetes.io/projected/8225bb93-169a-41cc-bdec-d466c6aa140f-kube-api-access-9bz6v\") pod \"redhat-marketplace-vlqnl\" (UID: \"8225bb93-169a-41cc-bdec-d466c6aa140f\") " pod="openshift-marketplace/redhat-marketplace-vlqnl" Mar 20 11:22:33 crc kubenswrapper[4860]: I0320 11:22:33.007667 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bz6v\" (UniqueName: 
\"kubernetes.io/projected/8225bb93-169a-41cc-bdec-d466c6aa140f-kube-api-access-9bz6v\") pod \"redhat-marketplace-vlqnl\" (UID: \"8225bb93-169a-41cc-bdec-d466c6aa140f\") " pod="openshift-marketplace/redhat-marketplace-vlqnl" Mar 20 11:22:33 crc kubenswrapper[4860]: I0320 11:22:33.007805 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8225bb93-169a-41cc-bdec-d466c6aa140f-catalog-content\") pod \"redhat-marketplace-vlqnl\" (UID: \"8225bb93-169a-41cc-bdec-d466c6aa140f\") " pod="openshift-marketplace/redhat-marketplace-vlqnl" Mar 20 11:22:33 crc kubenswrapper[4860]: I0320 11:22:33.007841 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8225bb93-169a-41cc-bdec-d466c6aa140f-utilities\") pod \"redhat-marketplace-vlqnl\" (UID: \"8225bb93-169a-41cc-bdec-d466c6aa140f\") " pod="openshift-marketplace/redhat-marketplace-vlqnl" Mar 20 11:22:33 crc kubenswrapper[4860]: I0320 11:22:33.008657 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8225bb93-169a-41cc-bdec-d466c6aa140f-catalog-content\") pod \"redhat-marketplace-vlqnl\" (UID: \"8225bb93-169a-41cc-bdec-d466c6aa140f\") " pod="openshift-marketplace/redhat-marketplace-vlqnl" Mar 20 11:22:33 crc kubenswrapper[4860]: I0320 11:22:33.008756 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8225bb93-169a-41cc-bdec-d466c6aa140f-utilities\") pod \"redhat-marketplace-vlqnl\" (UID: \"8225bb93-169a-41cc-bdec-d466c6aa140f\") " pod="openshift-marketplace/redhat-marketplace-vlqnl" Mar 20 11:22:33 crc kubenswrapper[4860]: I0320 11:22:33.031519 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bz6v\" (UniqueName: 
\"kubernetes.io/projected/8225bb93-169a-41cc-bdec-d466c6aa140f-kube-api-access-9bz6v\") pod \"redhat-marketplace-vlqnl\" (UID: \"8225bb93-169a-41cc-bdec-d466c6aa140f\") " pod="openshift-marketplace/redhat-marketplace-vlqnl" Mar 20 11:22:33 crc kubenswrapper[4860]: I0320 11:22:33.105801 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vlqnl" Mar 20 11:22:33 crc kubenswrapper[4860]: I0320 11:22:33.442103 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-55zcw" Mar 20 11:22:33 crc kubenswrapper[4860]: I0320 11:22:33.442643 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-55zcw" Mar 20 11:22:33 crc kubenswrapper[4860]: I0320 11:22:33.491993 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-55zcw" Mar 20 11:22:33 crc kubenswrapper[4860]: I0320 11:22:33.590466 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vlqnl"] Mar 20 11:22:33 crc kubenswrapper[4860]: I0320 11:22:33.957482 4860 generic.go:334] "Generic (PLEG): container finished" podID="8225bb93-169a-41cc-bdec-d466c6aa140f" containerID="0890a49332d1578d7bf6d9f8b129c1bd02d5ea141b8a561aed20c925d7d250f1" exitCode=0 Mar 20 11:22:33 crc kubenswrapper[4860]: I0320 11:22:33.957977 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vlqnl" event={"ID":"8225bb93-169a-41cc-bdec-d466c6aa140f","Type":"ContainerDied","Data":"0890a49332d1578d7bf6d9f8b129c1bd02d5ea141b8a561aed20c925d7d250f1"} Mar 20 11:22:33 crc kubenswrapper[4860]: I0320 11:22:33.958407 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vlqnl" 
event={"ID":"8225bb93-169a-41cc-bdec-d466c6aa140f","Type":"ContainerStarted","Data":"ed55ee2e36d27fdb0b0342cb9867725ba680f9b8414c697e3fd56d92789efcf2"} Mar 20 11:22:34 crc kubenswrapper[4860]: I0320 11:22:34.004573 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-55zcw" Mar 20 11:22:35 crc kubenswrapper[4860]: I0320 11:22:35.741563 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-55zcw"] Mar 20 11:22:35 crc kubenswrapper[4860]: I0320 11:22:35.982505 4860 generic.go:334] "Generic (PLEG): container finished" podID="8225bb93-169a-41cc-bdec-d466c6aa140f" containerID="5239da29342fce95cf3a754b1f1cafdd6623c84bb7b64c705d6fd9efc09e4418" exitCode=0 Mar 20 11:22:35 crc kubenswrapper[4860]: I0320 11:22:35.982555 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vlqnl" event={"ID":"8225bb93-169a-41cc-bdec-d466c6aa140f","Type":"ContainerDied","Data":"5239da29342fce95cf3a754b1f1cafdd6623c84bb7b64c705d6fd9efc09e4418"} Mar 20 11:22:35 crc kubenswrapper[4860]: I0320 11:22:35.982814 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-55zcw" podUID="4a827151-75d9-472f-9cd8-bd45629d4c42" containerName="registry-server" containerID="cri-o://781b9d20ec35c7a0424aa30bd44f6e040f8d8d2cc92f8320ace80fd2b6217c2d" gracePeriod=2 Mar 20 11:22:36 crc kubenswrapper[4860]: I0320 11:22:36.392070 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-55zcw" Mar 20 11:22:36 crc kubenswrapper[4860]: I0320 11:22:36.471824 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gx6q5\" (UniqueName: \"kubernetes.io/projected/4a827151-75d9-472f-9cd8-bd45629d4c42-kube-api-access-gx6q5\") pod \"4a827151-75d9-472f-9cd8-bd45629d4c42\" (UID: \"4a827151-75d9-472f-9cd8-bd45629d4c42\") " Mar 20 11:22:36 crc kubenswrapper[4860]: I0320 11:22:36.471893 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a827151-75d9-472f-9cd8-bd45629d4c42-catalog-content\") pod \"4a827151-75d9-472f-9cd8-bd45629d4c42\" (UID: \"4a827151-75d9-472f-9cd8-bd45629d4c42\") " Mar 20 11:22:36 crc kubenswrapper[4860]: I0320 11:22:36.471945 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a827151-75d9-472f-9cd8-bd45629d4c42-utilities\") pod \"4a827151-75d9-472f-9cd8-bd45629d4c42\" (UID: \"4a827151-75d9-472f-9cd8-bd45629d4c42\") " Mar 20 11:22:36 crc kubenswrapper[4860]: I0320 11:22:36.473929 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a827151-75d9-472f-9cd8-bd45629d4c42-utilities" (OuterVolumeSpecName: "utilities") pod "4a827151-75d9-472f-9cd8-bd45629d4c42" (UID: "4a827151-75d9-472f-9cd8-bd45629d4c42"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:22:36 crc kubenswrapper[4860]: I0320 11:22:36.481195 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a827151-75d9-472f-9cd8-bd45629d4c42-kube-api-access-gx6q5" (OuterVolumeSpecName: "kube-api-access-gx6q5") pod "4a827151-75d9-472f-9cd8-bd45629d4c42" (UID: "4a827151-75d9-472f-9cd8-bd45629d4c42"). InnerVolumeSpecName "kube-api-access-gx6q5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:22:36 crc kubenswrapper[4860]: I0320 11:22:36.574517 4860 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a827151-75d9-472f-9cd8-bd45629d4c42-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:22:36 crc kubenswrapper[4860]: I0320 11:22:36.574562 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gx6q5\" (UniqueName: \"kubernetes.io/projected/4a827151-75d9-472f-9cd8-bd45629d4c42-kube-api-access-gx6q5\") on node \"crc\" DevicePath \"\"" Mar 20 11:22:36 crc kubenswrapper[4860]: I0320 11:22:36.994040 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vlqnl" event={"ID":"8225bb93-169a-41cc-bdec-d466c6aa140f","Type":"ContainerStarted","Data":"5c233e03126b7d5f84d03611f41f80cb77491947a9c2bc764bf19465608658bc"} Mar 20 11:22:37 crc kubenswrapper[4860]: I0320 11:22:37.000167 4860 generic.go:334] "Generic (PLEG): container finished" podID="4a827151-75d9-472f-9cd8-bd45629d4c42" containerID="781b9d20ec35c7a0424aa30bd44f6e040f8d8d2cc92f8320ace80fd2b6217c2d" exitCode=0 Mar 20 11:22:37 crc kubenswrapper[4860]: I0320 11:22:37.000263 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-55zcw" event={"ID":"4a827151-75d9-472f-9cd8-bd45629d4c42","Type":"ContainerDied","Data":"781b9d20ec35c7a0424aa30bd44f6e040f8d8d2cc92f8320ace80fd2b6217c2d"} Mar 20 11:22:37 crc kubenswrapper[4860]: I0320 11:22:37.000314 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-55zcw" event={"ID":"4a827151-75d9-472f-9cd8-bd45629d4c42","Type":"ContainerDied","Data":"b5fa759c5bdc4dc59a530ac28ff28e464c1717a4acb2944a6d76e0c54f12fb36"} Mar 20 11:22:37 crc kubenswrapper[4860]: I0320 11:22:37.000314 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-55zcw" Mar 20 11:22:37 crc kubenswrapper[4860]: I0320 11:22:37.000341 4860 scope.go:117] "RemoveContainer" containerID="781b9d20ec35c7a0424aa30bd44f6e040f8d8d2cc92f8320ace80fd2b6217c2d" Mar 20 11:22:37 crc kubenswrapper[4860]: I0320 11:22:37.028927 4860 scope.go:117] "RemoveContainer" containerID="c43e402dc03d939aa3cc72188039cb78ca9d23e9b95540d5201d55b2128b616e" Mar 20 11:22:37 crc kubenswrapper[4860]: I0320 11:22:37.032864 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vlqnl" podStartSLOduration=2.492740575 podStartE2EDuration="5.032847578s" podCreationTimestamp="2026-03-20 11:22:32 +0000 UTC" firstStartedPulling="2026-03-20 11:22:33.959636096 +0000 UTC m=+1678.180997024" lastFinishedPulling="2026-03-20 11:22:36.499743109 +0000 UTC m=+1680.721104027" observedRunningTime="2026-03-20 11:22:37.023992677 +0000 UTC m=+1681.245353585" watchObservedRunningTime="2026-03-20 11:22:37.032847578 +0000 UTC m=+1681.254208476" Mar 20 11:22:37 crc kubenswrapper[4860]: I0320 11:22:37.052455 4860 scope.go:117] "RemoveContainer" containerID="7ca363352ba6feb8cf71be3a28e5c8eb8525d4e02376901124ce7391b7b5f726" Mar 20 11:22:37 crc kubenswrapper[4860]: I0320 11:22:37.071544 4860 scope.go:117] "RemoveContainer" containerID="781b9d20ec35c7a0424aa30bd44f6e040f8d8d2cc92f8320ace80fd2b6217c2d" Mar 20 11:22:37 crc kubenswrapper[4860]: E0320 11:22:37.072205 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"781b9d20ec35c7a0424aa30bd44f6e040f8d8d2cc92f8320ace80fd2b6217c2d\": container with ID starting with 781b9d20ec35c7a0424aa30bd44f6e040f8d8d2cc92f8320ace80fd2b6217c2d not found: ID does not exist" containerID="781b9d20ec35c7a0424aa30bd44f6e040f8d8d2cc92f8320ace80fd2b6217c2d" Mar 20 11:22:37 crc kubenswrapper[4860]: I0320 11:22:37.072275 4860 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"781b9d20ec35c7a0424aa30bd44f6e040f8d8d2cc92f8320ace80fd2b6217c2d"} err="failed to get container status \"781b9d20ec35c7a0424aa30bd44f6e040f8d8d2cc92f8320ace80fd2b6217c2d\": rpc error: code = NotFound desc = could not find container \"781b9d20ec35c7a0424aa30bd44f6e040f8d8d2cc92f8320ace80fd2b6217c2d\": container with ID starting with 781b9d20ec35c7a0424aa30bd44f6e040f8d8d2cc92f8320ace80fd2b6217c2d not found: ID does not exist" Mar 20 11:22:37 crc kubenswrapper[4860]: I0320 11:22:37.072316 4860 scope.go:117] "RemoveContainer" containerID="c43e402dc03d939aa3cc72188039cb78ca9d23e9b95540d5201d55b2128b616e" Mar 20 11:22:37 crc kubenswrapper[4860]: E0320 11:22:37.072984 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c43e402dc03d939aa3cc72188039cb78ca9d23e9b95540d5201d55b2128b616e\": container with ID starting with c43e402dc03d939aa3cc72188039cb78ca9d23e9b95540d5201d55b2128b616e not found: ID does not exist" containerID="c43e402dc03d939aa3cc72188039cb78ca9d23e9b95540d5201d55b2128b616e" Mar 20 11:22:37 crc kubenswrapper[4860]: I0320 11:22:37.073057 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c43e402dc03d939aa3cc72188039cb78ca9d23e9b95540d5201d55b2128b616e"} err="failed to get container status \"c43e402dc03d939aa3cc72188039cb78ca9d23e9b95540d5201d55b2128b616e\": rpc error: code = NotFound desc = could not find container \"c43e402dc03d939aa3cc72188039cb78ca9d23e9b95540d5201d55b2128b616e\": container with ID starting with c43e402dc03d939aa3cc72188039cb78ca9d23e9b95540d5201d55b2128b616e not found: ID does not exist" Mar 20 11:22:37 crc kubenswrapper[4860]: I0320 11:22:37.073120 4860 scope.go:117] "RemoveContainer" containerID="7ca363352ba6feb8cf71be3a28e5c8eb8525d4e02376901124ce7391b7b5f726" Mar 20 11:22:37 crc kubenswrapper[4860]: E0320 11:22:37.074755 4860 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ca363352ba6feb8cf71be3a28e5c8eb8525d4e02376901124ce7391b7b5f726\": container with ID starting with 7ca363352ba6feb8cf71be3a28e5c8eb8525d4e02376901124ce7391b7b5f726 not found: ID does not exist" containerID="7ca363352ba6feb8cf71be3a28e5c8eb8525d4e02376901124ce7391b7b5f726"
Mar 20 11:22:37 crc kubenswrapper[4860]: I0320 11:22:37.074807 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ca363352ba6feb8cf71be3a28e5c8eb8525d4e02376901124ce7391b7b5f726"} err="failed to get container status \"7ca363352ba6feb8cf71be3a28e5c8eb8525d4e02376901124ce7391b7b5f726\": rpc error: code = NotFound desc = could not find container \"7ca363352ba6feb8cf71be3a28e5c8eb8525d4e02376901124ce7391b7b5f726\": container with ID starting with 7ca363352ba6feb8cf71be3a28e5c8eb8525d4e02376901124ce7391b7b5f726 not found: ID does not exist"
Mar 20 11:22:37 crc kubenswrapper[4860]: I0320 11:22:37.349335 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a827151-75d9-472f-9cd8-bd45629d4c42-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4a827151-75d9-472f-9cd8-bd45629d4c42" (UID: "4a827151-75d9-472f-9cd8-bd45629d4c42"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 11:22:37 crc kubenswrapper[4860]: I0320 11:22:37.387924 4860 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a827151-75d9-472f-9cd8-bd45629d4c42-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 11:22:37 crc kubenswrapper[4860]: I0320 11:22:37.418550 4860 scope.go:117] "RemoveContainer" containerID="07ab4f7bad4a73673176b5c6eb66987e832ab6b48e2270c87816fed93c5b31d4"
Mar 20 11:22:37 crc kubenswrapper[4860]: E0320 11:22:37.418871 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080"
Mar 20 11:22:37 crc kubenswrapper[4860]: I0320 11:22:37.624172 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-55zcw"]
Mar 20 11:22:37 crc kubenswrapper[4860]: I0320 11:22:37.631465 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-55zcw"]
Mar 20 11:22:39 crc kubenswrapper[4860]: I0320 11:22:39.447533 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a827151-75d9-472f-9cd8-bd45629d4c42" path="/var/lib/kubelet/pods/4a827151-75d9-472f-9cd8-bd45629d4c42/volumes"
Mar 20 11:22:43 crc kubenswrapper[4860]: I0320 11:22:43.106545 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vlqnl"
Mar 20 11:22:43 crc kubenswrapper[4860]: I0320 11:22:43.107029 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vlqnl"
Mar 20 11:22:43 crc kubenswrapper[4860]: I0320 11:22:43.160166 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vlqnl"
Mar 20 11:22:44 crc kubenswrapper[4860]: I0320 11:22:44.128182 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vlqnl"
Mar 20 11:22:44 crc kubenswrapper[4860]: I0320 11:22:44.192447 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vlqnl"]
Mar 20 11:22:46 crc kubenswrapper[4860]: I0320 11:22:46.085554 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vlqnl" podUID="8225bb93-169a-41cc-bdec-d466c6aa140f" containerName="registry-server" containerID="cri-o://5c233e03126b7d5f84d03611f41f80cb77491947a9c2bc764bf19465608658bc" gracePeriod=2
Mar 20 11:22:47 crc kubenswrapper[4860]: I0320 11:22:47.081210 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vlqnl"
Mar 20 11:22:47 crc kubenswrapper[4860]: I0320 11:22:47.101509 4860 generic.go:334] "Generic (PLEG): container finished" podID="8225bb93-169a-41cc-bdec-d466c6aa140f" containerID="5c233e03126b7d5f84d03611f41f80cb77491947a9c2bc764bf19465608658bc" exitCode=0
Mar 20 11:22:47 crc kubenswrapper[4860]: I0320 11:22:47.101554 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vlqnl" event={"ID":"8225bb93-169a-41cc-bdec-d466c6aa140f","Type":"ContainerDied","Data":"5c233e03126b7d5f84d03611f41f80cb77491947a9c2bc764bf19465608658bc"}
Mar 20 11:22:47 crc kubenswrapper[4860]: I0320 11:22:47.101584 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vlqnl" event={"ID":"8225bb93-169a-41cc-bdec-d466c6aa140f","Type":"ContainerDied","Data":"ed55ee2e36d27fdb0b0342cb9867725ba680f9b8414c697e3fd56d92789efcf2"}
Mar 20 11:22:47 crc kubenswrapper[4860]: I0320 11:22:47.101604 4860 scope.go:117] "RemoveContainer" containerID="5c233e03126b7d5f84d03611f41f80cb77491947a9c2bc764bf19465608658bc"
Mar 20 11:22:47 crc kubenswrapper[4860]: I0320 11:22:47.101738 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vlqnl"
Mar 20 11:22:47 crc kubenswrapper[4860]: I0320 11:22:47.133089 4860 scope.go:117] "RemoveContainer" containerID="5239da29342fce95cf3a754b1f1cafdd6623c84bb7b64c705d6fd9efc09e4418"
Mar 20 11:22:47 crc kubenswrapper[4860]: I0320 11:22:47.153183 4860 scope.go:117] "RemoveContainer" containerID="0890a49332d1578d7bf6d9f8b129c1bd02d5ea141b8a561aed20c925d7d250f1"
Mar 20 11:22:47 crc kubenswrapper[4860]: I0320 11:22:47.156205 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9bz6v\" (UniqueName: \"kubernetes.io/projected/8225bb93-169a-41cc-bdec-d466c6aa140f-kube-api-access-9bz6v\") pod \"8225bb93-169a-41cc-bdec-d466c6aa140f\" (UID: \"8225bb93-169a-41cc-bdec-d466c6aa140f\") "
Mar 20 11:22:47 crc kubenswrapper[4860]: I0320 11:22:47.156290 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8225bb93-169a-41cc-bdec-d466c6aa140f-utilities\") pod \"8225bb93-169a-41cc-bdec-d466c6aa140f\" (UID: \"8225bb93-169a-41cc-bdec-d466c6aa140f\") "
Mar 20 11:22:47 crc kubenswrapper[4860]: I0320 11:22:47.156427 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8225bb93-169a-41cc-bdec-d466c6aa140f-catalog-content\") pod \"8225bb93-169a-41cc-bdec-d466c6aa140f\" (UID: \"8225bb93-169a-41cc-bdec-d466c6aa140f\") "
Mar 20 11:22:47 crc kubenswrapper[4860]: I0320 11:22:47.157504 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8225bb93-169a-41cc-bdec-d466c6aa140f-utilities" (OuterVolumeSpecName: "utilities") pod "8225bb93-169a-41cc-bdec-d466c6aa140f" (UID: "8225bb93-169a-41cc-bdec-d466c6aa140f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 11:22:47 crc kubenswrapper[4860]: I0320 11:22:47.164172 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8225bb93-169a-41cc-bdec-d466c6aa140f-kube-api-access-9bz6v" (OuterVolumeSpecName: "kube-api-access-9bz6v") pod "8225bb93-169a-41cc-bdec-d466c6aa140f" (UID: "8225bb93-169a-41cc-bdec-d466c6aa140f"). InnerVolumeSpecName "kube-api-access-9bz6v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 11:22:47 crc kubenswrapper[4860]: I0320 11:22:47.182026 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8225bb93-169a-41cc-bdec-d466c6aa140f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8225bb93-169a-41cc-bdec-d466c6aa140f" (UID: "8225bb93-169a-41cc-bdec-d466c6aa140f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 11:22:47 crc kubenswrapper[4860]: I0320 11:22:47.205138 4860 scope.go:117] "RemoveContainer" containerID="5c233e03126b7d5f84d03611f41f80cb77491947a9c2bc764bf19465608658bc"
Mar 20 11:22:47 crc kubenswrapper[4860]: E0320 11:22:47.205594 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c233e03126b7d5f84d03611f41f80cb77491947a9c2bc764bf19465608658bc\": container with ID starting with 5c233e03126b7d5f84d03611f41f80cb77491947a9c2bc764bf19465608658bc not found: ID does not exist" containerID="5c233e03126b7d5f84d03611f41f80cb77491947a9c2bc764bf19465608658bc"
Mar 20 11:22:47 crc kubenswrapper[4860]: I0320 11:22:47.205648 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c233e03126b7d5f84d03611f41f80cb77491947a9c2bc764bf19465608658bc"} err="failed to get container status \"5c233e03126b7d5f84d03611f41f80cb77491947a9c2bc764bf19465608658bc\": rpc error: code = NotFound desc = could not find container \"5c233e03126b7d5f84d03611f41f80cb77491947a9c2bc764bf19465608658bc\": container with ID starting with 5c233e03126b7d5f84d03611f41f80cb77491947a9c2bc764bf19465608658bc not found: ID does not exist"
Mar 20 11:22:47 crc kubenswrapper[4860]: I0320 11:22:47.205678 4860 scope.go:117] "RemoveContainer" containerID="5239da29342fce95cf3a754b1f1cafdd6623c84bb7b64c705d6fd9efc09e4418"
Mar 20 11:22:47 crc kubenswrapper[4860]: E0320 11:22:47.205902 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5239da29342fce95cf3a754b1f1cafdd6623c84bb7b64c705d6fd9efc09e4418\": container with ID starting with 5239da29342fce95cf3a754b1f1cafdd6623c84bb7b64c705d6fd9efc09e4418 not found: ID does not exist" containerID="5239da29342fce95cf3a754b1f1cafdd6623c84bb7b64c705d6fd9efc09e4418"
Mar 20 11:22:47 crc kubenswrapper[4860]: I0320 11:22:47.205933 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5239da29342fce95cf3a754b1f1cafdd6623c84bb7b64c705d6fd9efc09e4418"} err="failed to get container status \"5239da29342fce95cf3a754b1f1cafdd6623c84bb7b64c705d6fd9efc09e4418\": rpc error: code = NotFound desc = could not find container \"5239da29342fce95cf3a754b1f1cafdd6623c84bb7b64c705d6fd9efc09e4418\": container with ID starting with 5239da29342fce95cf3a754b1f1cafdd6623c84bb7b64c705d6fd9efc09e4418 not found: ID does not exist"
Mar 20 11:22:47 crc kubenswrapper[4860]: I0320 11:22:47.205959 4860 scope.go:117] "RemoveContainer" containerID="0890a49332d1578d7bf6d9f8b129c1bd02d5ea141b8a561aed20c925d7d250f1"
Mar 20 11:22:47 crc kubenswrapper[4860]: E0320 11:22:47.206500 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0890a49332d1578d7bf6d9f8b129c1bd02d5ea141b8a561aed20c925d7d250f1\": container with ID starting with 0890a49332d1578d7bf6d9f8b129c1bd02d5ea141b8a561aed20c925d7d250f1 not found: ID does not exist" containerID="0890a49332d1578d7bf6d9f8b129c1bd02d5ea141b8a561aed20c925d7d250f1"
Mar 20 11:22:47 crc kubenswrapper[4860]: I0320 11:22:47.206530 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0890a49332d1578d7bf6d9f8b129c1bd02d5ea141b8a561aed20c925d7d250f1"} err="failed to get container status \"0890a49332d1578d7bf6d9f8b129c1bd02d5ea141b8a561aed20c925d7d250f1\": rpc error: code = NotFound desc = could not find container \"0890a49332d1578d7bf6d9f8b129c1bd02d5ea141b8a561aed20c925d7d250f1\": container with ID starting with 0890a49332d1578d7bf6d9f8b129c1bd02d5ea141b8a561aed20c925d7d250f1 not found: ID does not exist"
Mar 20 11:22:47 crc kubenswrapper[4860]: I0320 11:22:47.258266 4860 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8225bb93-169a-41cc-bdec-d466c6aa140f-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 11:22:47 crc kubenswrapper[4860]: I0320 11:22:47.258311 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9bz6v\" (UniqueName: \"kubernetes.io/projected/8225bb93-169a-41cc-bdec-d466c6aa140f-kube-api-access-9bz6v\") on node \"crc\" DevicePath \"\""
Mar 20 11:22:47 crc kubenswrapper[4860]: I0320 11:22:47.258326 4860 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8225bb93-169a-41cc-bdec-d466c6aa140f-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 11:22:47 crc kubenswrapper[4860]: I0320 11:22:47.440385 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vlqnl"]
Mar 20 11:22:47 crc kubenswrapper[4860]: I0320 11:22:47.450855 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vlqnl"]
Mar 20 11:22:48 crc kubenswrapper[4860]: I0320 11:22:48.087826 4860 scope.go:117] "RemoveContainer" containerID="615c37395180628a3c76825ddb15312c7ceadec62513b183ca243bd28c96c9ed"
Mar 20 11:22:48 crc kubenswrapper[4860]: I0320 11:22:48.413641 4860 scope.go:117] "RemoveContainer" containerID="07ab4f7bad4a73673176b5c6eb66987e832ab6b48e2270c87816fed93c5b31d4"
Mar 20 11:22:48 crc kubenswrapper[4860]: E0320 11:22:48.413941 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080"
Mar 20 11:22:49 crc kubenswrapper[4860]: I0320 11:22:49.432979 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8225bb93-169a-41cc-bdec-d466c6aa140f" path="/var/lib/kubelet/pods/8225bb93-169a-41cc-bdec-d466c6aa140f/volumes"
Mar 20 11:23:03 crc kubenswrapper[4860]: I0320 11:23:03.414907 4860 scope.go:117] "RemoveContainer" containerID="07ab4f7bad4a73673176b5c6eb66987e832ab6b48e2270c87816fed93c5b31d4"
Mar 20 11:23:03 crc kubenswrapper[4860]: E0320 11:23:03.416737 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080"
Mar 20 11:23:14 crc kubenswrapper[4860]: I0320 11:23:14.413688 4860 scope.go:117] "RemoveContainer" containerID="07ab4f7bad4a73673176b5c6eb66987e832ab6b48e2270c87816fed93c5b31d4"
Mar 20 11:23:14 crc kubenswrapper[4860]: E0320 11:23:14.414254 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080"
Mar 20 11:23:25 crc kubenswrapper[4860]: I0320 11:23:25.414021 4860 scope.go:117] "RemoveContainer" containerID="07ab4f7bad4a73673176b5c6eb66987e832ab6b48e2270c87816fed93c5b31d4"
Mar 20 11:23:25 crc kubenswrapper[4860]: E0320 11:23:25.414760 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080"
Mar 20 11:23:38 crc kubenswrapper[4860]: I0320 11:23:38.413116 4860 scope.go:117] "RemoveContainer" containerID="07ab4f7bad4a73673176b5c6eb66987e832ab6b48e2270c87816fed93c5b31d4"
Mar 20 11:23:38 crc kubenswrapper[4860]: E0320 11:23:38.414236 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080"
Mar 20 11:23:50 crc kubenswrapper[4860]: I0320 11:23:50.414075 4860 scope.go:117] "RemoveContainer" containerID="07ab4f7bad4a73673176b5c6eb66987e832ab6b48e2270c87816fed93c5b31d4"
Mar 20 11:23:50 crc kubenswrapper[4860]: E0320 11:23:50.415159 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080"
Mar 20 11:24:00 crc kubenswrapper[4860]: I0320 11:24:00.155817 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566764-lml7m"]
Mar 20 11:24:00 crc kubenswrapper[4860]: E0320 11:24:00.157388 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8225bb93-169a-41cc-bdec-d466c6aa140f" containerName="registry-server"
Mar 20 11:24:00 crc kubenswrapper[4860]: I0320 11:24:00.157410 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="8225bb93-169a-41cc-bdec-d466c6aa140f" containerName="registry-server"
Mar 20 11:24:00 crc kubenswrapper[4860]: E0320 11:24:00.157439 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a827151-75d9-472f-9cd8-bd45629d4c42" containerName="extract-content"
Mar 20 11:24:00 crc kubenswrapper[4860]: I0320 11:24:00.157449 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a827151-75d9-472f-9cd8-bd45629d4c42" containerName="extract-content"
Mar 20 11:24:00 crc kubenswrapper[4860]: E0320 11:24:00.157462 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8225bb93-169a-41cc-bdec-d466c6aa140f" containerName="extract-utilities"
Mar 20 11:24:00 crc kubenswrapper[4860]: I0320 11:24:00.157472 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="8225bb93-169a-41cc-bdec-d466c6aa140f" containerName="extract-utilities"
Mar 20 11:24:00 crc kubenswrapper[4860]: E0320 11:24:00.157491 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a827151-75d9-472f-9cd8-bd45629d4c42" containerName="registry-server"
Mar 20 11:24:00 crc kubenswrapper[4860]: I0320 11:24:00.157501 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a827151-75d9-472f-9cd8-bd45629d4c42" containerName="registry-server"
Mar 20 11:24:00 crc kubenswrapper[4860]: E0320 11:24:00.157516 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a827151-75d9-472f-9cd8-bd45629d4c42" containerName="extract-utilities"
Mar 20 11:24:00 crc kubenswrapper[4860]: I0320 11:24:00.157524 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a827151-75d9-472f-9cd8-bd45629d4c42" containerName="extract-utilities"
Mar 20 11:24:00 crc kubenswrapper[4860]: E0320 11:24:00.157535 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8225bb93-169a-41cc-bdec-d466c6aa140f" containerName="extract-content"
Mar 20 11:24:00 crc kubenswrapper[4860]: I0320 11:24:00.157547 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="8225bb93-169a-41cc-bdec-d466c6aa140f" containerName="extract-content"
Mar 20 11:24:00 crc kubenswrapper[4860]: I0320 11:24:00.157779 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a827151-75d9-472f-9cd8-bd45629d4c42" containerName="registry-server"
Mar 20 11:24:00 crc kubenswrapper[4860]: I0320 11:24:00.157794 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="8225bb93-169a-41cc-bdec-d466c6aa140f" containerName="registry-server"
Mar 20 11:24:00 crc kubenswrapper[4860]: I0320 11:24:00.158520 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566764-lml7m"
Mar 20 11:24:00 crc kubenswrapper[4860]: I0320 11:24:00.161359 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-82p2f"
Mar 20 11:24:00 crc kubenswrapper[4860]: I0320 11:24:00.161632 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 11:24:00 crc kubenswrapper[4860]: I0320 11:24:00.165114 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566764-lml7m"]
Mar 20 11:24:00 crc kubenswrapper[4860]: I0320 11:24:00.165987 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 11:24:00 crc kubenswrapper[4860]: I0320 11:24:00.286530 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2ntr\" (UniqueName: \"kubernetes.io/projected/6d332346-eeda-4316-9757-20948492ca2a-kube-api-access-p2ntr\") pod \"auto-csr-approver-29566764-lml7m\" (UID: \"6d332346-eeda-4316-9757-20948492ca2a\") " pod="openshift-infra/auto-csr-approver-29566764-lml7m"
Mar 20 11:24:00 crc kubenswrapper[4860]: I0320 11:24:00.387698 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2ntr\" (UniqueName: \"kubernetes.io/projected/6d332346-eeda-4316-9757-20948492ca2a-kube-api-access-p2ntr\") pod \"auto-csr-approver-29566764-lml7m\" (UID: \"6d332346-eeda-4316-9757-20948492ca2a\") " pod="openshift-infra/auto-csr-approver-29566764-lml7m"
Mar 20 11:24:00 crc kubenswrapper[4860]: I0320 11:24:00.408564 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2ntr\" (UniqueName: \"kubernetes.io/projected/6d332346-eeda-4316-9757-20948492ca2a-kube-api-access-p2ntr\") pod \"auto-csr-approver-29566764-lml7m\" (UID: \"6d332346-eeda-4316-9757-20948492ca2a\") " pod="openshift-infra/auto-csr-approver-29566764-lml7m"
Mar 20 11:24:00 crc kubenswrapper[4860]: I0320 11:24:00.479836 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566764-lml7m"
Mar 20 11:24:00 crc kubenswrapper[4860]: I0320 11:24:00.942140 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566764-lml7m"]
Mar 20 11:24:01 crc kubenswrapper[4860]: I0320 11:24:01.731967 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566764-lml7m" event={"ID":"6d332346-eeda-4316-9757-20948492ca2a","Type":"ContainerStarted","Data":"fa166993a364e7b61fba777749a0d56e6dfca916c31d3e14b23fae61b008dc2c"}
Mar 20 11:24:05 crc kubenswrapper[4860]: I0320 11:24:05.414181 4860 scope.go:117] "RemoveContainer" containerID="07ab4f7bad4a73673176b5c6eb66987e832ab6b48e2270c87816fed93c5b31d4"
Mar 20 11:24:05 crc kubenswrapper[4860]: E0320 11:24:05.414942 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080"
Mar 20 11:24:07 crc kubenswrapper[4860]: I0320 11:24:07.779936 4860 generic.go:334] "Generic (PLEG): container finished" podID="6d332346-eeda-4316-9757-20948492ca2a" containerID="c19bdb0d3c5267be319cea9d2984f965c5e932c087050f789ce73157db7c4694" exitCode=0
Mar 20 11:24:07 crc kubenswrapper[4860]: I0320 11:24:07.780466 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566764-lml7m" event={"ID":"6d332346-eeda-4316-9757-20948492ca2a","Type":"ContainerDied","Data":"c19bdb0d3c5267be319cea9d2984f965c5e932c087050f789ce73157db7c4694"}
Mar 20 11:24:09 crc kubenswrapper[4860]: I0320 11:24:09.058886 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566764-lml7m"
Mar 20 11:24:09 crc kubenswrapper[4860]: I0320 11:24:09.139598 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2ntr\" (UniqueName: \"kubernetes.io/projected/6d332346-eeda-4316-9757-20948492ca2a-kube-api-access-p2ntr\") pod \"6d332346-eeda-4316-9757-20948492ca2a\" (UID: \"6d332346-eeda-4316-9757-20948492ca2a\") "
Mar 20 11:24:09 crc kubenswrapper[4860]: I0320 11:24:09.148332 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d332346-eeda-4316-9757-20948492ca2a-kube-api-access-p2ntr" (OuterVolumeSpecName: "kube-api-access-p2ntr") pod "6d332346-eeda-4316-9757-20948492ca2a" (UID: "6d332346-eeda-4316-9757-20948492ca2a"). InnerVolumeSpecName "kube-api-access-p2ntr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 11:24:09 crc kubenswrapper[4860]: I0320 11:24:09.242163 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2ntr\" (UniqueName: \"kubernetes.io/projected/6d332346-eeda-4316-9757-20948492ca2a-kube-api-access-p2ntr\") on node \"crc\" DevicePath \"\""
Mar 20 11:24:09 crc kubenswrapper[4860]: I0320 11:24:09.797055 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566764-lml7m" event={"ID":"6d332346-eeda-4316-9757-20948492ca2a","Type":"ContainerDied","Data":"fa166993a364e7b61fba777749a0d56e6dfca916c31d3e14b23fae61b008dc2c"}
Mar 20 11:24:09 crc kubenswrapper[4860]: I0320 11:24:09.797465 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa166993a364e7b61fba777749a0d56e6dfca916c31d3e14b23fae61b008dc2c"
Mar 20 11:24:09 crc kubenswrapper[4860]: I0320 11:24:09.797084 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566764-lml7m"
Mar 20 11:24:10 crc kubenswrapper[4860]: I0320 11:24:10.146015 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566758-bblpr"]
Mar 20 11:24:10 crc kubenswrapper[4860]: I0320 11:24:10.152143 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566758-bblpr"]
Mar 20 11:24:11 crc kubenswrapper[4860]: I0320 11:24:11.423219 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d4db42a-d915-4c4d-a985-be77a5381514" path="/var/lib/kubelet/pods/3d4db42a-d915-4c4d-a985-be77a5381514/volumes"
Mar 20 11:24:17 crc kubenswrapper[4860]: I0320 11:24:17.418581 4860 scope.go:117] "RemoveContainer" containerID="07ab4f7bad4a73673176b5c6eb66987e832ab6b48e2270c87816fed93c5b31d4"
Mar 20 11:24:17 crc kubenswrapper[4860]: E0320 11:24:17.419889 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080"
Mar 20 11:24:28 crc kubenswrapper[4860]: I0320 11:24:28.414156 4860 scope.go:117] "RemoveContainer" containerID="07ab4f7bad4a73673176b5c6eb66987e832ab6b48e2270c87816fed93c5b31d4"
Mar 20 11:24:28 crc kubenswrapper[4860]: E0320 11:24:28.415033 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080"
Mar 20 11:24:40 crc kubenswrapper[4860]: I0320 11:24:40.413745 4860 scope.go:117] "RemoveContainer" containerID="07ab4f7bad4a73673176b5c6eb66987e832ab6b48e2270c87816fed93c5b31d4"
Mar 20 11:24:40 crc kubenswrapper[4860]: E0320 11:24:40.415172 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080"
Mar 20 11:24:48 crc kubenswrapper[4860]: I0320 11:24:48.212119 4860 scope.go:117] "RemoveContainer" containerID="666fc76c19255af020ada26a1d756d00c6fc27b0113301cab647ad4c35e9ef0c"
Mar 20 11:24:53 crc kubenswrapper[4860]: I0320 11:24:53.414006 4860 scope.go:117] "RemoveContainer" containerID="07ab4f7bad4a73673176b5c6eb66987e832ab6b48e2270c87816fed93c5b31d4"
Mar 20 11:24:53 crc kubenswrapper[4860]: E0320 11:24:53.414769 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080"
Mar 20 11:25:06 crc kubenswrapper[4860]: I0320 11:25:06.416092 4860 scope.go:117] "RemoveContainer" containerID="07ab4f7bad4a73673176b5c6eb66987e832ab6b48e2270c87816fed93c5b31d4"
Mar 20 11:25:06 crc kubenswrapper[4860]: E0320 11:25:06.418751 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080"
Mar 20 11:25:21 crc kubenswrapper[4860]: I0320 11:25:21.413257 4860 scope.go:117] "RemoveContainer" containerID="07ab4f7bad4a73673176b5c6eb66987e832ab6b48e2270c87816fed93c5b31d4"
Mar 20 11:25:21 crc kubenswrapper[4860]: E0320 11:25:21.414308 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080"
Mar 20 11:25:34 crc kubenswrapper[4860]: I0320 11:25:34.414001 4860 scope.go:117] "RemoveContainer" containerID="07ab4f7bad4a73673176b5c6eb66987e832ab6b48e2270c87816fed93c5b31d4"
Mar 20 11:25:34 crc kubenswrapper[4860]: E0320 11:25:34.415159 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080"
Mar 20 11:25:45 crc kubenswrapper[4860]: I0320 11:25:45.413443 4860 scope.go:117] "RemoveContainer" containerID="07ab4f7bad4a73673176b5c6eb66987e832ab6b48e2270c87816fed93c5b31d4"
Mar 20 11:25:45 crc kubenswrapper[4860]: E0320 11:25:45.414402 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080"
Mar 20 11:25:57 crc kubenswrapper[4860]: I0320 11:25:57.420128 4860 scope.go:117] "RemoveContainer" containerID="07ab4f7bad4a73673176b5c6eb66987e832ab6b48e2270c87816fed93c5b31d4"
Mar 20 11:25:57 crc kubenswrapper[4860]: E0320 11:25:57.421209 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080"
Mar 20 11:26:00 crc kubenswrapper[4860]: I0320 11:26:00.152395 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566766-fhwnn"]
Mar 20 11:26:00 crc kubenswrapper[4860]: E0320 11:26:00.153412 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d332346-eeda-4316-9757-20948492ca2a" containerName="oc"
Mar 20 11:26:00 crc kubenswrapper[4860]: I0320 11:26:00.153438 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d332346-eeda-4316-9757-20948492ca2a" containerName="oc"
Mar 20 11:26:00 crc kubenswrapper[4860]: I0320 11:26:00.153638 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d332346-eeda-4316-9757-20948492ca2a" containerName="oc"
Mar 20 11:26:00 crc kubenswrapper[4860]: I0320 11:26:00.154437 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566766-fhwnn"
Mar 20 11:26:00 crc kubenswrapper[4860]: I0320 11:26:00.157732 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 11:26:00 crc kubenswrapper[4860]: I0320 11:26:00.158086 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 11:26:00 crc kubenswrapper[4860]: I0320 11:26:00.162698 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-82p2f"
Mar 20 11:26:00 crc kubenswrapper[4860]: I0320 11:26:00.164766 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prqr5\" (UniqueName: \"kubernetes.io/projected/8a62a673-bde2-4cf8-bae1-56252a15c71e-kube-api-access-prqr5\") pod \"auto-csr-approver-29566766-fhwnn\" (UID: \"8a62a673-bde2-4cf8-bae1-56252a15c71e\") " pod="openshift-infra/auto-csr-approver-29566766-fhwnn"
Mar 20 11:26:00 crc kubenswrapper[4860]: I0320 11:26:00.166084 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566766-fhwnn"]
Mar 20 11:26:00 crc kubenswrapper[4860]: I0320 11:26:00.266155 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prqr5\" (UniqueName: \"kubernetes.io/projected/8a62a673-bde2-4cf8-bae1-56252a15c71e-kube-api-access-prqr5\") pod \"auto-csr-approver-29566766-fhwnn\" (UID: \"8a62a673-bde2-4cf8-bae1-56252a15c71e\") " pod="openshift-infra/auto-csr-approver-29566766-fhwnn"
Mar 20 11:26:00 crc kubenswrapper[4860]: I0320 11:26:00.293140 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prqr5\" (UniqueName: \"kubernetes.io/projected/8a62a673-bde2-4cf8-bae1-56252a15c71e-kube-api-access-prqr5\") pod \"auto-csr-approver-29566766-fhwnn\" (UID: \"8a62a673-bde2-4cf8-bae1-56252a15c71e\") " pod="openshift-infra/auto-csr-approver-29566766-fhwnn"
Mar 20 11:26:00 crc kubenswrapper[4860]: I0320 11:26:00.483671 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566766-fhwnn"
Mar 20 11:26:00 crc kubenswrapper[4860]: I0320 11:26:00.929342 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566766-fhwnn"]
Mar 20 11:26:01 crc kubenswrapper[4860]: I0320 11:26:01.721108 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566766-fhwnn" event={"ID":"8a62a673-bde2-4cf8-bae1-56252a15c71e","Type":"ContainerStarted","Data":"db97e9b10e9b35035d0ce847e43abebd1ff265f8bdd042a898818f167ac6ca2f"}
Mar 20 11:26:02 crc kubenswrapper[4860]: I0320 11:26:02.731502 4860 generic.go:334] "Generic (PLEG): container finished" podID="8a62a673-bde2-4cf8-bae1-56252a15c71e" containerID="c8ce87256d0115d4f80012e1ce15b95a524f1b59f29da3d0c9299a0f68d2780e" exitCode=0
Mar 20 11:26:02 crc kubenswrapper[4860]: I0320 11:26:02.731559 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566766-fhwnn" event={"ID":"8a62a673-bde2-4cf8-bae1-56252a15c71e","Type":"ContainerDied","Data":"c8ce87256d0115d4f80012e1ce15b95a524f1b59f29da3d0c9299a0f68d2780e"}
Mar 20 11:26:04 crc kubenswrapper[4860]: I0320 11:26:04.078008 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566766-fhwnn"
Mar 20 11:26:04 crc kubenswrapper[4860]: I0320 11:26:04.242657 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prqr5\" (UniqueName: \"kubernetes.io/projected/8a62a673-bde2-4cf8-bae1-56252a15c71e-kube-api-access-prqr5\") pod \"8a62a673-bde2-4cf8-bae1-56252a15c71e\" (UID: \"8a62a673-bde2-4cf8-bae1-56252a15c71e\") "
Mar 20 11:26:04 crc kubenswrapper[4860]: I0320 11:26:04.251801 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a62a673-bde2-4cf8-bae1-56252a15c71e-kube-api-access-prqr5" (OuterVolumeSpecName: "kube-api-access-prqr5") pod "8a62a673-bde2-4cf8-bae1-56252a15c71e" (UID: "8a62a673-bde2-4cf8-bae1-56252a15c71e"). InnerVolumeSpecName "kube-api-access-prqr5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 11:26:04 crc kubenswrapper[4860]: I0320 11:26:04.344905 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-prqr5\" (UniqueName: \"kubernetes.io/projected/8a62a673-bde2-4cf8-bae1-56252a15c71e-kube-api-access-prqr5\") on node \"crc\" DevicePath \"\""
Mar 20 11:26:04 crc kubenswrapper[4860]: I0320 11:26:04.750076 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566766-fhwnn" event={"ID":"8a62a673-bde2-4cf8-bae1-56252a15c71e","Type":"ContainerDied","Data":"db97e9b10e9b35035d0ce847e43abebd1ff265f8bdd042a898818f167ac6ca2f"}
Mar 20 11:26:04 crc kubenswrapper[4860]: I0320 11:26:04.750585 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db97e9b10e9b35035d0ce847e43abebd1ff265f8bdd042a898818f167ac6ca2f"
Mar 20 11:26:04 crc kubenswrapper[4860]: I0320 11:26:04.750115 4860 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566766-fhwnn" Mar 20 11:26:05 crc kubenswrapper[4860]: I0320 11:26:05.151599 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566760-gs8qf"] Mar 20 11:26:05 crc kubenswrapper[4860]: I0320 11:26:05.157310 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566760-gs8qf"] Mar 20 11:26:05 crc kubenswrapper[4860]: I0320 11:26:05.422482 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02d4a854-e21a-46b4-976b-17645af17c8b" path="/var/lib/kubelet/pods/02d4a854-e21a-46b4-976b-17645af17c8b/volumes" Mar 20 11:26:11 crc kubenswrapper[4860]: I0320 11:26:11.413923 4860 scope.go:117] "RemoveContainer" containerID="07ab4f7bad4a73673176b5c6eb66987e832ab6b48e2270c87816fed93c5b31d4" Mar 20 11:26:11 crc kubenswrapper[4860]: E0320 11:26:11.415050 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:26:25 crc kubenswrapper[4860]: I0320 11:26:25.413550 4860 scope.go:117] "RemoveContainer" containerID="07ab4f7bad4a73673176b5c6eb66987e832ab6b48e2270c87816fed93c5b31d4" Mar 20 11:26:25 crc kubenswrapper[4860]: I0320 11:26:25.924622 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" event={"ID":"6a9df230-75a1-4b64-8d00-c179e9c19080","Type":"ContainerStarted","Data":"b5a83b0edf3d114fb0c90b2b89dbd92039d6760397381d08a497f8b893a5686e"} Mar 20 11:26:48 crc kubenswrapper[4860]: I0320 11:26:48.320828 4860 scope.go:117] "RemoveContainer" 
containerID="b7ac94cb420b15b471714072b91c8c315997e55e973785d5b6c9d7428acd11e7" Mar 20 11:28:00 crc kubenswrapper[4860]: I0320 11:28:00.191791 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566768-r8jtv"] Mar 20 11:28:00 crc kubenswrapper[4860]: E0320 11:28:00.193150 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a62a673-bde2-4cf8-bae1-56252a15c71e" containerName="oc" Mar 20 11:28:00 crc kubenswrapper[4860]: I0320 11:28:00.193168 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a62a673-bde2-4cf8-bae1-56252a15c71e" containerName="oc" Mar 20 11:28:00 crc kubenswrapper[4860]: I0320 11:28:00.193412 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a62a673-bde2-4cf8-bae1-56252a15c71e" containerName="oc" Mar 20 11:28:00 crc kubenswrapper[4860]: I0320 11:28:00.194085 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566768-r8jtv" Mar 20 11:28:00 crc kubenswrapper[4860]: I0320 11:28:00.199527 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:28:00 crc kubenswrapper[4860]: I0320 11:28:00.199711 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-82p2f" Mar 20 11:28:00 crc kubenswrapper[4860]: I0320 11:28:00.200199 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566768-r8jtv"] Mar 20 11:28:00 crc kubenswrapper[4860]: I0320 11:28:00.201495 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:28:00 crc kubenswrapper[4860]: I0320 11:28:00.273624 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85vql\" (UniqueName: \"kubernetes.io/projected/e7b6bebe-e27d-4ca9-9b8b-3f6ba51b8cd2-kube-api-access-85vql\") 
pod \"auto-csr-approver-29566768-r8jtv\" (UID: \"e7b6bebe-e27d-4ca9-9b8b-3f6ba51b8cd2\") " pod="openshift-infra/auto-csr-approver-29566768-r8jtv" Mar 20 11:28:00 crc kubenswrapper[4860]: I0320 11:28:00.374972 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85vql\" (UniqueName: \"kubernetes.io/projected/e7b6bebe-e27d-4ca9-9b8b-3f6ba51b8cd2-kube-api-access-85vql\") pod \"auto-csr-approver-29566768-r8jtv\" (UID: \"e7b6bebe-e27d-4ca9-9b8b-3f6ba51b8cd2\") " pod="openshift-infra/auto-csr-approver-29566768-r8jtv" Mar 20 11:28:00 crc kubenswrapper[4860]: I0320 11:28:00.397748 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85vql\" (UniqueName: \"kubernetes.io/projected/e7b6bebe-e27d-4ca9-9b8b-3f6ba51b8cd2-kube-api-access-85vql\") pod \"auto-csr-approver-29566768-r8jtv\" (UID: \"e7b6bebe-e27d-4ca9-9b8b-3f6ba51b8cd2\") " pod="openshift-infra/auto-csr-approver-29566768-r8jtv" Mar 20 11:28:00 crc kubenswrapper[4860]: I0320 11:28:00.517485 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566768-r8jtv" Mar 20 11:28:00 crc kubenswrapper[4860]: I0320 11:28:00.975285 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566768-r8jtv"] Mar 20 11:28:00 crc kubenswrapper[4860]: I0320 11:28:00.986070 4860 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 11:28:01 crc kubenswrapper[4860]: I0320 11:28:01.725783 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566768-r8jtv" event={"ID":"e7b6bebe-e27d-4ca9-9b8b-3f6ba51b8cd2","Type":"ContainerStarted","Data":"a9a24491117dfbd064a7ae396bca0316960cd788de751fbcc971c98dba991ca6"} Mar 20 11:28:02 crc kubenswrapper[4860]: I0320 11:28:02.735214 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566768-r8jtv" event={"ID":"e7b6bebe-e27d-4ca9-9b8b-3f6ba51b8cd2","Type":"ContainerStarted","Data":"43beffcf03fad4254e2bb3cab94aa4a32cf894399c7d22a72063beb87aa2fc0f"} Mar 20 11:28:02 crc kubenswrapper[4860]: I0320 11:28:02.758006 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566768-r8jtv" podStartSLOduration=1.425143617 podStartE2EDuration="2.757982498s" podCreationTimestamp="2026-03-20 11:28:00 +0000 UTC" firstStartedPulling="2026-03-20 11:28:00.985762892 +0000 UTC m=+2005.207123800" lastFinishedPulling="2026-03-20 11:28:02.318601773 +0000 UTC m=+2006.539962681" observedRunningTime="2026-03-20 11:28:02.75472627 +0000 UTC m=+2006.976087168" watchObservedRunningTime="2026-03-20 11:28:02.757982498 +0000 UTC m=+2006.979343396" Mar 20 11:28:03 crc kubenswrapper[4860]: I0320 11:28:03.744502 4860 generic.go:334] "Generic (PLEG): container finished" podID="e7b6bebe-e27d-4ca9-9b8b-3f6ba51b8cd2" containerID="43beffcf03fad4254e2bb3cab94aa4a32cf894399c7d22a72063beb87aa2fc0f" exitCode=0 Mar 20 11:28:03 crc kubenswrapper[4860]: 
I0320 11:28:03.744573 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566768-r8jtv" event={"ID":"e7b6bebe-e27d-4ca9-9b8b-3f6ba51b8cd2","Type":"ContainerDied","Data":"43beffcf03fad4254e2bb3cab94aa4a32cf894399c7d22a72063beb87aa2fc0f"} Mar 20 11:28:05 crc kubenswrapper[4860]: I0320 11:28:05.089136 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566768-r8jtv" Mar 20 11:28:05 crc kubenswrapper[4860]: I0320 11:28:05.275272 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85vql\" (UniqueName: \"kubernetes.io/projected/e7b6bebe-e27d-4ca9-9b8b-3f6ba51b8cd2-kube-api-access-85vql\") pod \"e7b6bebe-e27d-4ca9-9b8b-3f6ba51b8cd2\" (UID: \"e7b6bebe-e27d-4ca9-9b8b-3f6ba51b8cd2\") " Mar 20 11:28:05 crc kubenswrapper[4860]: I0320 11:28:05.282436 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7b6bebe-e27d-4ca9-9b8b-3f6ba51b8cd2-kube-api-access-85vql" (OuterVolumeSpecName: "kube-api-access-85vql") pod "e7b6bebe-e27d-4ca9-9b8b-3f6ba51b8cd2" (UID: "e7b6bebe-e27d-4ca9-9b8b-3f6ba51b8cd2"). InnerVolumeSpecName "kube-api-access-85vql". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:28:05 crc kubenswrapper[4860]: I0320 11:28:05.377657 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85vql\" (UniqueName: \"kubernetes.io/projected/e7b6bebe-e27d-4ca9-9b8b-3f6ba51b8cd2-kube-api-access-85vql\") on node \"crc\" DevicePath \"\"" Mar 20 11:28:05 crc kubenswrapper[4860]: I0320 11:28:05.761326 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566768-r8jtv" event={"ID":"e7b6bebe-e27d-4ca9-9b8b-3f6ba51b8cd2","Type":"ContainerDied","Data":"a9a24491117dfbd064a7ae396bca0316960cd788de751fbcc971c98dba991ca6"} Mar 20 11:28:05 crc kubenswrapper[4860]: I0320 11:28:05.761780 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9a24491117dfbd064a7ae396bca0316960cd788de751fbcc971c98dba991ca6" Mar 20 11:28:05 crc kubenswrapper[4860]: I0320 11:28:05.761399 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566768-r8jtv" Mar 20 11:28:05 crc kubenswrapper[4860]: I0320 11:28:05.826941 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566762-4zgtb"] Mar 20 11:28:05 crc kubenswrapper[4860]: I0320 11:28:05.833741 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566762-4zgtb"] Mar 20 11:28:07 crc kubenswrapper[4860]: I0320 11:28:07.422792 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92fe5f45-6751-47ad-ba62-ce45b44f7460" path="/var/lib/kubelet/pods/92fe5f45-6751-47ad-ba62-ce45b44f7460/volumes" Mar 20 11:28:48 crc kubenswrapper[4860]: I0320 11:28:48.423803 4860 scope.go:117] "RemoveContainer" containerID="35b3d06c932db51a34b9c266a504a6d9855ee6f888b737047ccd7f4d521d88bd" Mar 20 11:28:52 crc kubenswrapper[4860]: I0320 11:28:52.344262 4860 patch_prober.go:28] interesting pod/machine-config-daemon-kvdqp 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:28:52 crc kubenswrapper[4860]: I0320 11:28:52.344764 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:29:22 crc kubenswrapper[4860]: I0320 11:29:22.344767 4860 patch_prober.go:28] interesting pod/machine-config-daemon-kvdqp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:29:22 crc kubenswrapper[4860]: I0320 11:29:22.345830 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:29:52 crc kubenswrapper[4860]: I0320 11:29:52.344136 4860 patch_prober.go:28] interesting pod/machine-config-daemon-kvdqp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:29:52 crc kubenswrapper[4860]: I0320 11:29:52.344873 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:29:52 crc kubenswrapper[4860]: I0320 11:29:52.344993 4860 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" Mar 20 11:29:52 crc kubenswrapper[4860]: I0320 11:29:52.346458 4860 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b5a83b0edf3d114fb0c90b2b89dbd92039d6760397381d08a497f8b893a5686e"} pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 11:29:52 crc kubenswrapper[4860]: I0320 11:29:52.346617 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" containerID="cri-o://b5a83b0edf3d114fb0c90b2b89dbd92039d6760397381d08a497f8b893a5686e" gracePeriod=600 Mar 20 11:29:52 crc kubenswrapper[4860]: I0320 11:29:52.651086 4860 generic.go:334] "Generic (PLEG): container finished" podID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerID="b5a83b0edf3d114fb0c90b2b89dbd92039d6760397381d08a497f8b893a5686e" exitCode=0 Mar 20 11:29:52 crc kubenswrapper[4860]: I0320 11:29:52.651192 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" event={"ID":"6a9df230-75a1-4b64-8d00-c179e9c19080","Type":"ContainerDied","Data":"b5a83b0edf3d114fb0c90b2b89dbd92039d6760397381d08a497f8b893a5686e"} Mar 20 11:29:52 crc kubenswrapper[4860]: I0320 11:29:52.651950 4860 scope.go:117] "RemoveContainer" containerID="07ab4f7bad4a73673176b5c6eb66987e832ab6b48e2270c87816fed93c5b31d4" Mar 20 11:29:53 crc kubenswrapper[4860]: I0320 11:29:53.664940 4860 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" event={"ID":"6a9df230-75a1-4b64-8d00-c179e9c19080","Type":"ContainerStarted","Data":"174cf0258fb93c52c964e89ccb7ea77d9cb0e4fb77f37a49b81be365390d1f67"} Mar 20 11:30:00 crc kubenswrapper[4860]: I0320 11:30:00.170702 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566770-n4lhg"] Mar 20 11:30:00 crc kubenswrapper[4860]: E0320 11:30:00.171917 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7b6bebe-e27d-4ca9-9b8b-3f6ba51b8cd2" containerName="oc" Mar 20 11:30:00 crc kubenswrapper[4860]: I0320 11:30:00.171936 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7b6bebe-e27d-4ca9-9b8b-3f6ba51b8cd2" containerName="oc" Mar 20 11:30:00 crc kubenswrapper[4860]: I0320 11:30:00.172115 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7b6bebe-e27d-4ca9-9b8b-3f6ba51b8cd2" containerName="oc" Mar 20 11:30:00 crc kubenswrapper[4860]: I0320 11:30:00.172870 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-n4lhg" Mar 20 11:30:00 crc kubenswrapper[4860]: I0320 11:30:00.175801 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 11:30:00 crc kubenswrapper[4860]: I0320 11:30:00.176619 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 11:30:00 crc kubenswrapper[4860]: I0320 11:30:00.183688 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566770-bmm72"] Mar 20 11:30:00 crc kubenswrapper[4860]: I0320 11:30:00.185485 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566770-bmm72" Mar 20 11:30:00 crc kubenswrapper[4860]: I0320 11:30:00.191955 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:30:00 crc kubenswrapper[4860]: I0320 11:30:00.192286 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-82p2f" Mar 20 11:30:00 crc kubenswrapper[4860]: I0320 11:30:00.192370 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566770-n4lhg"] Mar 20 11:30:00 crc kubenswrapper[4860]: I0320 11:30:00.193245 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:30:00 crc kubenswrapper[4860]: I0320 11:30:00.200805 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566770-bmm72"] Mar 20 11:30:00 crc kubenswrapper[4860]: I0320 11:30:00.216882 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wg4p\" (UniqueName: \"kubernetes.io/projected/490c18f1-2364-4847-a7c9-5f603a7dbde2-kube-api-access-4wg4p\") pod \"collect-profiles-29566770-n4lhg\" (UID: \"490c18f1-2364-4847-a7c9-5f603a7dbde2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-n4lhg" Mar 20 11:30:00 crc kubenswrapper[4860]: I0320 11:30:00.216944 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/490c18f1-2364-4847-a7c9-5f603a7dbde2-secret-volume\") pod \"collect-profiles-29566770-n4lhg\" (UID: \"490c18f1-2364-4847-a7c9-5f603a7dbde2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-n4lhg" Mar 20 11:30:00 crc kubenswrapper[4860]: I0320 11:30:00.216981 4860 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/490c18f1-2364-4847-a7c9-5f603a7dbde2-config-volume\") pod \"collect-profiles-29566770-n4lhg\" (UID: \"490c18f1-2364-4847-a7c9-5f603a7dbde2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-n4lhg" Mar 20 11:30:00 crc kubenswrapper[4860]: I0320 11:30:00.217041 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fkjq\" (UniqueName: \"kubernetes.io/projected/d911fad2-83cb-46e3-8f48-eb6f4b0e5605-kube-api-access-9fkjq\") pod \"auto-csr-approver-29566770-bmm72\" (UID: \"d911fad2-83cb-46e3-8f48-eb6f4b0e5605\") " pod="openshift-infra/auto-csr-approver-29566770-bmm72" Mar 20 11:30:00 crc kubenswrapper[4860]: I0320 11:30:00.317763 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fkjq\" (UniqueName: \"kubernetes.io/projected/d911fad2-83cb-46e3-8f48-eb6f4b0e5605-kube-api-access-9fkjq\") pod \"auto-csr-approver-29566770-bmm72\" (UID: \"d911fad2-83cb-46e3-8f48-eb6f4b0e5605\") " pod="openshift-infra/auto-csr-approver-29566770-bmm72" Mar 20 11:30:00 crc kubenswrapper[4860]: I0320 11:30:00.318189 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wg4p\" (UniqueName: \"kubernetes.io/projected/490c18f1-2364-4847-a7c9-5f603a7dbde2-kube-api-access-4wg4p\") pod \"collect-profiles-29566770-n4lhg\" (UID: \"490c18f1-2364-4847-a7c9-5f603a7dbde2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-n4lhg" Mar 20 11:30:00 crc kubenswrapper[4860]: I0320 11:30:00.318378 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/490c18f1-2364-4847-a7c9-5f603a7dbde2-secret-volume\") pod \"collect-profiles-29566770-n4lhg\" (UID: \"490c18f1-2364-4847-a7c9-5f603a7dbde2\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-n4lhg" Mar 20 11:30:00 crc kubenswrapper[4860]: I0320 11:30:00.318520 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/490c18f1-2364-4847-a7c9-5f603a7dbde2-config-volume\") pod \"collect-profiles-29566770-n4lhg\" (UID: \"490c18f1-2364-4847-a7c9-5f603a7dbde2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-n4lhg" Mar 20 11:30:00 crc kubenswrapper[4860]: I0320 11:30:00.319938 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/490c18f1-2364-4847-a7c9-5f603a7dbde2-config-volume\") pod \"collect-profiles-29566770-n4lhg\" (UID: \"490c18f1-2364-4847-a7c9-5f603a7dbde2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-n4lhg" Mar 20 11:30:00 crc kubenswrapper[4860]: I0320 11:30:00.326982 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/490c18f1-2364-4847-a7c9-5f603a7dbde2-secret-volume\") pod \"collect-profiles-29566770-n4lhg\" (UID: \"490c18f1-2364-4847-a7c9-5f603a7dbde2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-n4lhg" Mar 20 11:30:00 crc kubenswrapper[4860]: I0320 11:30:00.336046 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fkjq\" (UniqueName: \"kubernetes.io/projected/d911fad2-83cb-46e3-8f48-eb6f4b0e5605-kube-api-access-9fkjq\") pod \"auto-csr-approver-29566770-bmm72\" (UID: \"d911fad2-83cb-46e3-8f48-eb6f4b0e5605\") " pod="openshift-infra/auto-csr-approver-29566770-bmm72" Mar 20 11:30:00 crc kubenswrapper[4860]: I0320 11:30:00.337074 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wg4p\" (UniqueName: \"kubernetes.io/projected/490c18f1-2364-4847-a7c9-5f603a7dbde2-kube-api-access-4wg4p\") pod 
\"collect-profiles-29566770-n4lhg\" (UID: \"490c18f1-2364-4847-a7c9-5f603a7dbde2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-n4lhg" Mar 20 11:30:00 crc kubenswrapper[4860]: I0320 11:30:00.497025 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-n4lhg" Mar 20 11:30:00 crc kubenswrapper[4860]: I0320 11:30:00.514460 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566770-bmm72" Mar 20 11:30:00 crc kubenswrapper[4860]: I0320 11:30:00.961309 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566770-bmm72"] Mar 20 11:30:01 crc kubenswrapper[4860]: I0320 11:30:01.009512 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566770-n4lhg"] Mar 20 11:30:01 crc kubenswrapper[4860]: W0320 11:30:01.017166 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod490c18f1_2364_4847_a7c9_5f603a7dbde2.slice/crio-454f5c1f9a974ed3f7b49aa7b53c6dd78a91694d9d3e82fb945db64b5ca62fa3 WatchSource:0}: Error finding container 454f5c1f9a974ed3f7b49aa7b53c6dd78a91694d9d3e82fb945db64b5ca62fa3: Status 404 returned error can't find the container with id 454f5c1f9a974ed3f7b49aa7b53c6dd78a91694d9d3e82fb945db64b5ca62fa3 Mar 20 11:30:01 crc kubenswrapper[4860]: I0320 11:30:01.763635 4860 generic.go:334] "Generic (PLEG): container finished" podID="490c18f1-2364-4847-a7c9-5f603a7dbde2" containerID="dea0662d075d177bbae1d376b3ebc31bedea7bf86ce0257b207c8d237e8238e0" exitCode=0 Mar 20 11:30:01 crc kubenswrapper[4860]: I0320 11:30:01.763756 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-n4lhg" 
event={"ID":"490c18f1-2364-4847-a7c9-5f603a7dbde2","Type":"ContainerDied","Data":"dea0662d075d177bbae1d376b3ebc31bedea7bf86ce0257b207c8d237e8238e0"} Mar 20 11:30:01 crc kubenswrapper[4860]: I0320 11:30:01.764565 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-n4lhg" event={"ID":"490c18f1-2364-4847-a7c9-5f603a7dbde2","Type":"ContainerStarted","Data":"454f5c1f9a974ed3f7b49aa7b53c6dd78a91694d9d3e82fb945db64b5ca62fa3"} Mar 20 11:30:01 crc kubenswrapper[4860]: I0320 11:30:01.766644 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566770-bmm72" event={"ID":"d911fad2-83cb-46e3-8f48-eb6f4b0e5605","Type":"ContainerStarted","Data":"1ee4dc5a50bbdc832ddc677756dde3d638c21cf4eb2a169831ce07b51562b0e0"} Mar 20 11:30:02 crc kubenswrapper[4860]: I0320 11:30:02.775058 4860 generic.go:334] "Generic (PLEG): container finished" podID="d911fad2-83cb-46e3-8f48-eb6f4b0e5605" containerID="6c25fc89bb9294757bf7b8ce97118d32231c76b11a8724a70385e83cc510600a" exitCode=0 Mar 20 11:30:02 crc kubenswrapper[4860]: I0320 11:30:02.775111 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566770-bmm72" event={"ID":"d911fad2-83cb-46e3-8f48-eb6f4b0e5605","Type":"ContainerDied","Data":"6c25fc89bb9294757bf7b8ce97118d32231c76b11a8724a70385e83cc510600a"} Mar 20 11:30:03 crc kubenswrapper[4860]: I0320 11:30:03.070764 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-n4lhg" Mar 20 11:30:03 crc kubenswrapper[4860]: I0320 11:30:03.173803 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/490c18f1-2364-4847-a7c9-5f603a7dbde2-config-volume\") pod \"490c18f1-2364-4847-a7c9-5f603a7dbde2\" (UID: \"490c18f1-2364-4847-a7c9-5f603a7dbde2\") " Mar 20 11:30:03 crc kubenswrapper[4860]: I0320 11:30:03.173864 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/490c18f1-2364-4847-a7c9-5f603a7dbde2-secret-volume\") pod \"490c18f1-2364-4847-a7c9-5f603a7dbde2\" (UID: \"490c18f1-2364-4847-a7c9-5f603a7dbde2\") " Mar 20 11:30:03 crc kubenswrapper[4860]: I0320 11:30:03.173934 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wg4p\" (UniqueName: \"kubernetes.io/projected/490c18f1-2364-4847-a7c9-5f603a7dbde2-kube-api-access-4wg4p\") pod \"490c18f1-2364-4847-a7c9-5f603a7dbde2\" (UID: \"490c18f1-2364-4847-a7c9-5f603a7dbde2\") " Mar 20 11:30:03 crc kubenswrapper[4860]: I0320 11:30:03.174906 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/490c18f1-2364-4847-a7c9-5f603a7dbde2-config-volume" (OuterVolumeSpecName: "config-volume") pod "490c18f1-2364-4847-a7c9-5f603a7dbde2" (UID: "490c18f1-2364-4847-a7c9-5f603a7dbde2"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:30:03 crc kubenswrapper[4860]: I0320 11:30:03.180498 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/490c18f1-2364-4847-a7c9-5f603a7dbde2-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "490c18f1-2364-4847-a7c9-5f603a7dbde2" (UID: "490c18f1-2364-4847-a7c9-5f603a7dbde2"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:30:03 crc kubenswrapper[4860]: I0320 11:30:03.180700 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/490c18f1-2364-4847-a7c9-5f603a7dbde2-kube-api-access-4wg4p" (OuterVolumeSpecName: "kube-api-access-4wg4p") pod "490c18f1-2364-4847-a7c9-5f603a7dbde2" (UID: "490c18f1-2364-4847-a7c9-5f603a7dbde2"). InnerVolumeSpecName "kube-api-access-4wg4p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:30:03 crc kubenswrapper[4860]: I0320 11:30:03.275186 4860 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/490c18f1-2364-4847-a7c9-5f603a7dbde2-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 11:30:03 crc kubenswrapper[4860]: I0320 11:30:03.275262 4860 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/490c18f1-2364-4847-a7c9-5f603a7dbde2-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 11:30:03 crc kubenswrapper[4860]: I0320 11:30:03.275275 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wg4p\" (UniqueName: \"kubernetes.io/projected/490c18f1-2364-4847-a7c9-5f603a7dbde2-kube-api-access-4wg4p\") on node \"crc\" DevicePath \"\"" Mar 20 11:30:03 crc kubenswrapper[4860]: I0320 11:30:03.788076 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-n4lhg" Mar 20 11:30:03 crc kubenswrapper[4860]: I0320 11:30:03.788841 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-n4lhg" event={"ID":"490c18f1-2364-4847-a7c9-5f603a7dbde2","Type":"ContainerDied","Data":"454f5c1f9a974ed3f7b49aa7b53c6dd78a91694d9d3e82fb945db64b5ca62fa3"} Mar 20 11:30:03 crc kubenswrapper[4860]: I0320 11:30:03.788868 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="454f5c1f9a974ed3f7b49aa7b53c6dd78a91694d9d3e82fb945db64b5ca62fa3" Mar 20 11:30:04 crc kubenswrapper[4860]: I0320 11:30:04.046768 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566770-bmm72" Mar 20 11:30:04 crc kubenswrapper[4860]: I0320 11:30:04.145835 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566725-d6wf9"] Mar 20 11:30:04 crc kubenswrapper[4860]: I0320 11:30:04.151376 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566725-d6wf9"] Mar 20 11:30:04 crc kubenswrapper[4860]: I0320 11:30:04.224618 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fkjq\" (UniqueName: \"kubernetes.io/projected/d911fad2-83cb-46e3-8f48-eb6f4b0e5605-kube-api-access-9fkjq\") pod \"d911fad2-83cb-46e3-8f48-eb6f4b0e5605\" (UID: \"d911fad2-83cb-46e3-8f48-eb6f4b0e5605\") " Mar 20 11:30:04 crc kubenswrapper[4860]: I0320 11:30:04.228089 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d911fad2-83cb-46e3-8f48-eb6f4b0e5605-kube-api-access-9fkjq" (OuterVolumeSpecName: "kube-api-access-9fkjq") pod "d911fad2-83cb-46e3-8f48-eb6f4b0e5605" (UID: "d911fad2-83cb-46e3-8f48-eb6f4b0e5605"). 
InnerVolumeSpecName "kube-api-access-9fkjq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:30:04 crc kubenswrapper[4860]: I0320 11:30:04.327101 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fkjq\" (UniqueName: \"kubernetes.io/projected/d911fad2-83cb-46e3-8f48-eb6f4b0e5605-kube-api-access-9fkjq\") on node \"crc\" DevicePath \"\"" Mar 20 11:30:04 crc kubenswrapper[4860]: I0320 11:30:04.796645 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566770-bmm72" event={"ID":"d911fad2-83cb-46e3-8f48-eb6f4b0e5605","Type":"ContainerDied","Data":"1ee4dc5a50bbdc832ddc677756dde3d638c21cf4eb2a169831ce07b51562b0e0"} Mar 20 11:30:04 crc kubenswrapper[4860]: I0320 11:30:04.796699 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ee4dc5a50bbdc832ddc677756dde3d638c21cf4eb2a169831ce07b51562b0e0" Mar 20 11:30:04 crc kubenswrapper[4860]: I0320 11:30:04.796751 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566770-bmm72" Mar 20 11:30:05 crc kubenswrapper[4860]: I0320 11:30:05.101675 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566764-lml7m"] Mar 20 11:30:05 crc kubenswrapper[4860]: I0320 11:30:05.108091 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566764-lml7m"] Mar 20 11:30:05 crc kubenswrapper[4860]: I0320 11:30:05.424091 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="437c32d4-4b5f-4657-86d6-5214e3bfc01f" path="/var/lib/kubelet/pods/437c32d4-4b5f-4657-86d6-5214e3bfc01f/volumes" Mar 20 11:30:05 crc kubenswrapper[4860]: I0320 11:30:05.425004 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d332346-eeda-4316-9757-20948492ca2a" path="/var/lib/kubelet/pods/6d332346-eeda-4316-9757-20948492ca2a/volumes" Mar 20 11:30:48 crc kubenswrapper[4860]: I0320 11:30:48.539192 4860 scope.go:117] "RemoveContainer" containerID="c19bdb0d3c5267be319cea9d2984f965c5e932c087050f789ce73157db7c4694" Mar 20 11:30:48 crc kubenswrapper[4860]: I0320 11:30:48.592974 4860 scope.go:117] "RemoveContainer" containerID="509a0ab6073b8f241ed054d972f10c10904777731b271c4522d9caaf55b66c8c" Mar 20 11:31:13 crc kubenswrapper[4860]: I0320 11:31:13.499626 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nvhg7"] Mar 20 11:31:13 crc kubenswrapper[4860]: E0320 11:31:13.500711 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="490c18f1-2364-4847-a7c9-5f603a7dbde2" containerName="collect-profiles" Mar 20 11:31:13 crc kubenswrapper[4860]: I0320 11:31:13.500731 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="490c18f1-2364-4847-a7c9-5f603a7dbde2" containerName="collect-profiles" Mar 20 11:31:13 crc kubenswrapper[4860]: E0320 11:31:13.500760 4860 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d911fad2-83cb-46e3-8f48-eb6f4b0e5605" containerName="oc" Mar 20 11:31:13 crc kubenswrapper[4860]: I0320 11:31:13.500767 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="d911fad2-83cb-46e3-8f48-eb6f4b0e5605" containerName="oc" Mar 20 11:31:13 crc kubenswrapper[4860]: I0320 11:31:13.500954 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="d911fad2-83cb-46e3-8f48-eb6f4b0e5605" containerName="oc" Mar 20 11:31:13 crc kubenswrapper[4860]: I0320 11:31:13.500984 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="490c18f1-2364-4847-a7c9-5f603a7dbde2" containerName="collect-profiles" Mar 20 11:31:13 crc kubenswrapper[4860]: I0320 11:31:13.502315 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nvhg7" Mar 20 11:31:13 crc kubenswrapper[4860]: I0320 11:31:13.520982 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nvhg7"] Mar 20 11:31:13 crc kubenswrapper[4860]: I0320 11:31:13.702046 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ab38144-c30d-4aed-884c-8ace682fe5ea-utilities\") pod \"community-operators-nvhg7\" (UID: \"4ab38144-c30d-4aed-884c-8ace682fe5ea\") " pod="openshift-marketplace/community-operators-nvhg7" Mar 20 11:31:13 crc kubenswrapper[4860]: I0320 11:31:13.702134 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42f4q\" (UniqueName: \"kubernetes.io/projected/4ab38144-c30d-4aed-884c-8ace682fe5ea-kube-api-access-42f4q\") pod \"community-operators-nvhg7\" (UID: \"4ab38144-c30d-4aed-884c-8ace682fe5ea\") " pod="openshift-marketplace/community-operators-nvhg7" Mar 20 11:31:13 crc kubenswrapper[4860]: I0320 11:31:13.702360 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ab38144-c30d-4aed-884c-8ace682fe5ea-catalog-content\") pod \"community-operators-nvhg7\" (UID: \"4ab38144-c30d-4aed-884c-8ace682fe5ea\") " pod="openshift-marketplace/community-operators-nvhg7" Mar 20 11:31:13 crc kubenswrapper[4860]: I0320 11:31:13.804294 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ab38144-c30d-4aed-884c-8ace682fe5ea-utilities\") pod \"community-operators-nvhg7\" (UID: \"4ab38144-c30d-4aed-884c-8ace682fe5ea\") " pod="openshift-marketplace/community-operators-nvhg7" Mar 20 11:31:13 crc kubenswrapper[4860]: I0320 11:31:13.804398 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42f4q\" (UniqueName: \"kubernetes.io/projected/4ab38144-c30d-4aed-884c-8ace682fe5ea-kube-api-access-42f4q\") pod \"community-operators-nvhg7\" (UID: \"4ab38144-c30d-4aed-884c-8ace682fe5ea\") " pod="openshift-marketplace/community-operators-nvhg7" Mar 20 11:31:13 crc kubenswrapper[4860]: I0320 11:31:13.804455 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ab38144-c30d-4aed-884c-8ace682fe5ea-catalog-content\") pod \"community-operators-nvhg7\" (UID: \"4ab38144-c30d-4aed-884c-8ace682fe5ea\") " pod="openshift-marketplace/community-operators-nvhg7" Mar 20 11:31:13 crc kubenswrapper[4860]: I0320 11:31:13.805246 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ab38144-c30d-4aed-884c-8ace682fe5ea-utilities\") pod \"community-operators-nvhg7\" (UID: \"4ab38144-c30d-4aed-884c-8ace682fe5ea\") " pod="openshift-marketplace/community-operators-nvhg7" Mar 20 11:31:13 crc kubenswrapper[4860]: I0320 11:31:13.805321 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/4ab38144-c30d-4aed-884c-8ace682fe5ea-catalog-content\") pod \"community-operators-nvhg7\" (UID: \"4ab38144-c30d-4aed-884c-8ace682fe5ea\") " pod="openshift-marketplace/community-operators-nvhg7" Mar 20 11:31:13 crc kubenswrapper[4860]: I0320 11:31:13.831562 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42f4q\" (UniqueName: \"kubernetes.io/projected/4ab38144-c30d-4aed-884c-8ace682fe5ea-kube-api-access-42f4q\") pod \"community-operators-nvhg7\" (UID: \"4ab38144-c30d-4aed-884c-8ace682fe5ea\") " pod="openshift-marketplace/community-operators-nvhg7" Mar 20 11:31:14 crc kubenswrapper[4860]: I0320 11:31:14.127808 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nvhg7" Mar 20 11:31:14 crc kubenswrapper[4860]: I0320 11:31:14.574154 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nvhg7"] Mar 20 11:31:15 crc kubenswrapper[4860]: I0320 11:31:15.406657 4860 generic.go:334] "Generic (PLEG): container finished" podID="4ab38144-c30d-4aed-884c-8ace682fe5ea" containerID="107cbe0685170349259a50f59931a9f77bb6c0d4533641b8385370d8b9ab6b8f" exitCode=0 Mar 20 11:31:15 crc kubenswrapper[4860]: I0320 11:31:15.406943 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nvhg7" event={"ID":"4ab38144-c30d-4aed-884c-8ace682fe5ea","Type":"ContainerDied","Data":"107cbe0685170349259a50f59931a9f77bb6c0d4533641b8385370d8b9ab6b8f"} Mar 20 11:31:15 crc kubenswrapper[4860]: I0320 11:31:15.407178 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nvhg7" event={"ID":"4ab38144-c30d-4aed-884c-8ace682fe5ea","Type":"ContainerStarted","Data":"86187c0243067078a3a74947d273e6d6198228ed2ad14a70469845cf59fe1145"} Mar 20 11:31:20 crc kubenswrapper[4860]: I0320 11:31:20.447377 4860 generic.go:334] "Generic (PLEG): container 
finished" podID="4ab38144-c30d-4aed-884c-8ace682fe5ea" containerID="853c52a81e8d09dce968d126eef52abf63c6037a7202e7f785694020dbc92c61" exitCode=0 Mar 20 11:31:20 crc kubenswrapper[4860]: I0320 11:31:20.447434 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nvhg7" event={"ID":"4ab38144-c30d-4aed-884c-8ace682fe5ea","Type":"ContainerDied","Data":"853c52a81e8d09dce968d126eef52abf63c6037a7202e7f785694020dbc92c61"} Mar 20 11:31:21 crc kubenswrapper[4860]: I0320 11:31:21.458718 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nvhg7" event={"ID":"4ab38144-c30d-4aed-884c-8ace682fe5ea","Type":"ContainerStarted","Data":"75ebac83878486e06b87ecb22c6114329904482ce09343e98ff64010821eafca"} Mar 20 11:31:24 crc kubenswrapper[4860]: I0320 11:31:24.128403 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nvhg7" Mar 20 11:31:24 crc kubenswrapper[4860]: I0320 11:31:24.128942 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nvhg7" Mar 20 11:31:24 crc kubenswrapper[4860]: I0320 11:31:24.202580 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nvhg7" Mar 20 11:31:24 crc kubenswrapper[4860]: I0320 11:31:24.229398 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nvhg7" podStartSLOduration=5.701737457 podStartE2EDuration="11.22937799s" podCreationTimestamp="2026-03-20 11:31:13 +0000 UTC" firstStartedPulling="2026-03-20 11:31:15.409081237 +0000 UTC m=+2199.630442135" lastFinishedPulling="2026-03-20 11:31:20.93672177 +0000 UTC m=+2205.158082668" observedRunningTime="2026-03-20 11:31:21.47819002 +0000 UTC m=+2205.699550948" watchObservedRunningTime="2026-03-20 11:31:24.22937799 +0000 UTC m=+2208.450738878" Mar 20 
11:31:34 crc kubenswrapper[4860]: I0320 11:31:34.287907 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nvhg7" Mar 20 11:31:34 crc kubenswrapper[4860]: I0320 11:31:34.386540 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nvhg7"] Mar 20 11:31:34 crc kubenswrapper[4860]: I0320 11:31:34.451639 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-s2x6p"] Mar 20 11:31:34 crc kubenswrapper[4860]: I0320 11:31:34.452021 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-s2x6p" podUID="da2b2cab-e4d8-48ed-b198-7aff45927348" containerName="registry-server" containerID="cri-o://c529e06c44dcbe461813fdd459369a14b6b219590561c7a4393e8a5bf49d0970" gracePeriod=2 Mar 20 11:31:35 crc kubenswrapper[4860]: I0320 11:31:35.617633 4860 generic.go:334] "Generic (PLEG): container finished" podID="da2b2cab-e4d8-48ed-b198-7aff45927348" containerID="c529e06c44dcbe461813fdd459369a14b6b219590561c7a4393e8a5bf49d0970" exitCode=0 Mar 20 11:31:35 crc kubenswrapper[4860]: I0320 11:31:35.617724 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s2x6p" event={"ID":"da2b2cab-e4d8-48ed-b198-7aff45927348","Type":"ContainerDied","Data":"c529e06c44dcbe461813fdd459369a14b6b219590561c7a4393e8a5bf49d0970"} Mar 20 11:31:35 crc kubenswrapper[4860]: I0320 11:31:35.689936 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-s2x6p" Mar 20 11:31:35 crc kubenswrapper[4860]: I0320 11:31:35.790998 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da2b2cab-e4d8-48ed-b198-7aff45927348-utilities\") pod \"da2b2cab-e4d8-48ed-b198-7aff45927348\" (UID: \"da2b2cab-e4d8-48ed-b198-7aff45927348\") " Mar 20 11:31:35 crc kubenswrapper[4860]: I0320 11:31:35.791068 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da2b2cab-e4d8-48ed-b198-7aff45927348-catalog-content\") pod \"da2b2cab-e4d8-48ed-b198-7aff45927348\" (UID: \"da2b2cab-e4d8-48ed-b198-7aff45927348\") " Mar 20 11:31:35 crc kubenswrapper[4860]: I0320 11:31:35.791119 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-267pr\" (UniqueName: \"kubernetes.io/projected/da2b2cab-e4d8-48ed-b198-7aff45927348-kube-api-access-267pr\") pod \"da2b2cab-e4d8-48ed-b198-7aff45927348\" (UID: \"da2b2cab-e4d8-48ed-b198-7aff45927348\") " Mar 20 11:31:35 crc kubenswrapper[4860]: I0320 11:31:35.792359 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da2b2cab-e4d8-48ed-b198-7aff45927348-utilities" (OuterVolumeSpecName: "utilities") pod "da2b2cab-e4d8-48ed-b198-7aff45927348" (UID: "da2b2cab-e4d8-48ed-b198-7aff45927348"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:31:35 crc kubenswrapper[4860]: I0320 11:31:35.823913 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da2b2cab-e4d8-48ed-b198-7aff45927348-kube-api-access-267pr" (OuterVolumeSpecName: "kube-api-access-267pr") pod "da2b2cab-e4d8-48ed-b198-7aff45927348" (UID: "da2b2cab-e4d8-48ed-b198-7aff45927348"). InnerVolumeSpecName "kube-api-access-267pr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:31:35 crc kubenswrapper[4860]: I0320 11:31:35.874015 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da2b2cab-e4d8-48ed-b198-7aff45927348-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "da2b2cab-e4d8-48ed-b198-7aff45927348" (UID: "da2b2cab-e4d8-48ed-b198-7aff45927348"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:31:35 crc kubenswrapper[4860]: I0320 11:31:35.892305 4860 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da2b2cab-e4d8-48ed-b198-7aff45927348-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:31:35 crc kubenswrapper[4860]: I0320 11:31:35.892349 4860 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da2b2cab-e4d8-48ed-b198-7aff45927348-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:31:35 crc kubenswrapper[4860]: I0320 11:31:35.892363 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-267pr\" (UniqueName: \"kubernetes.io/projected/da2b2cab-e4d8-48ed-b198-7aff45927348-kube-api-access-267pr\") on node \"crc\" DevicePath \"\"" Mar 20 11:31:36 crc kubenswrapper[4860]: I0320 11:31:36.630665 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s2x6p" event={"ID":"da2b2cab-e4d8-48ed-b198-7aff45927348","Type":"ContainerDied","Data":"0985eda395c30cb4fc11c5a030b8aabf733cd8d60366ed8ecb07d45313940c24"} Mar 20 11:31:36 crc kubenswrapper[4860]: I0320 11:31:36.630743 4860 scope.go:117] "RemoveContainer" containerID="c529e06c44dcbe461813fdd459369a14b6b219590561c7a4393e8a5bf49d0970" Mar 20 11:31:36 crc kubenswrapper[4860]: I0320 11:31:36.630755 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-s2x6p" Mar 20 11:31:36 crc kubenswrapper[4860]: I0320 11:31:36.661660 4860 scope.go:117] "RemoveContainer" containerID="3692f916ebd9e82f76728a61b0840c3354adb6672a36ba82bf89b317d7536cc6" Mar 20 11:31:36 crc kubenswrapper[4860]: I0320 11:31:36.667427 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-s2x6p"] Mar 20 11:31:36 crc kubenswrapper[4860]: I0320 11:31:36.681068 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-s2x6p"] Mar 20 11:31:36 crc kubenswrapper[4860]: I0320 11:31:36.690381 4860 scope.go:117] "RemoveContainer" containerID="a4f90ca93d3e43497e705c4521beb4348408ab8d69ef5b2bcd7028aec3d686d5" Mar 20 11:31:37 crc kubenswrapper[4860]: I0320 11:31:37.426725 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da2b2cab-e4d8-48ed-b198-7aff45927348" path="/var/lib/kubelet/pods/da2b2cab-e4d8-48ed-b198-7aff45927348/volumes" Mar 20 11:31:52 crc kubenswrapper[4860]: I0320 11:31:52.344611 4860 patch_prober.go:28] interesting pod/machine-config-daemon-kvdqp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:31:52 crc kubenswrapper[4860]: I0320 11:31:52.345453 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:32:00 crc kubenswrapper[4860]: I0320 11:32:00.145385 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566772-bq77p"] Mar 20 11:32:00 crc kubenswrapper[4860]: E0320 
11:32:00.146253 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da2b2cab-e4d8-48ed-b198-7aff45927348" containerName="extract-utilities" Mar 20 11:32:00 crc kubenswrapper[4860]: I0320 11:32:00.146268 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="da2b2cab-e4d8-48ed-b198-7aff45927348" containerName="extract-utilities" Mar 20 11:32:00 crc kubenswrapper[4860]: E0320 11:32:00.146285 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da2b2cab-e4d8-48ed-b198-7aff45927348" containerName="extract-content" Mar 20 11:32:00 crc kubenswrapper[4860]: I0320 11:32:00.146292 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="da2b2cab-e4d8-48ed-b198-7aff45927348" containerName="extract-content" Mar 20 11:32:00 crc kubenswrapper[4860]: E0320 11:32:00.146309 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da2b2cab-e4d8-48ed-b198-7aff45927348" containerName="registry-server" Mar 20 11:32:00 crc kubenswrapper[4860]: I0320 11:32:00.146316 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="da2b2cab-e4d8-48ed-b198-7aff45927348" containerName="registry-server" Mar 20 11:32:00 crc kubenswrapper[4860]: I0320 11:32:00.146482 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="da2b2cab-e4d8-48ed-b198-7aff45927348" containerName="registry-server" Mar 20 11:32:00 crc kubenswrapper[4860]: I0320 11:32:00.147007 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566772-bq77p" Mar 20 11:32:00 crc kubenswrapper[4860]: I0320 11:32:00.157408 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566772-bq77p"] Mar 20 11:32:00 crc kubenswrapper[4860]: I0320 11:32:00.161096 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:32:00 crc kubenswrapper[4860]: I0320 11:32:00.161392 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-82p2f" Mar 20 11:32:00 crc kubenswrapper[4860]: I0320 11:32:00.161542 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:32:00 crc kubenswrapper[4860]: I0320 11:32:00.307831 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7b8fx\" (UniqueName: \"kubernetes.io/projected/e22a4be9-9edf-4029-b504-f5c059318959-kube-api-access-7b8fx\") pod \"auto-csr-approver-29566772-bq77p\" (UID: \"e22a4be9-9edf-4029-b504-f5c059318959\") " pod="openshift-infra/auto-csr-approver-29566772-bq77p" Mar 20 11:32:00 crc kubenswrapper[4860]: I0320 11:32:00.409331 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7b8fx\" (UniqueName: \"kubernetes.io/projected/e22a4be9-9edf-4029-b504-f5c059318959-kube-api-access-7b8fx\") pod \"auto-csr-approver-29566772-bq77p\" (UID: \"e22a4be9-9edf-4029-b504-f5c059318959\") " pod="openshift-infra/auto-csr-approver-29566772-bq77p" Mar 20 11:32:00 crc kubenswrapper[4860]: I0320 11:32:00.442692 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7b8fx\" (UniqueName: \"kubernetes.io/projected/e22a4be9-9edf-4029-b504-f5c059318959-kube-api-access-7b8fx\") pod \"auto-csr-approver-29566772-bq77p\" (UID: \"e22a4be9-9edf-4029-b504-f5c059318959\") " 
pod="openshift-infra/auto-csr-approver-29566772-bq77p" Mar 20 11:32:00 crc kubenswrapper[4860]: I0320 11:32:00.469722 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566772-bq77p" Mar 20 11:32:00 crc kubenswrapper[4860]: I0320 11:32:00.910994 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566772-bq77p"] Mar 20 11:32:01 crc kubenswrapper[4860]: I0320 11:32:01.875146 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566772-bq77p" event={"ID":"e22a4be9-9edf-4029-b504-f5c059318959","Type":"ContainerStarted","Data":"83a239c60ed9ecb77830fb64d9ff1297eff5199a99cf5a3d882155d3a988e68e"} Mar 20 11:32:02 crc kubenswrapper[4860]: I0320 11:32:02.885077 4860 generic.go:334] "Generic (PLEG): container finished" podID="e22a4be9-9edf-4029-b504-f5c059318959" containerID="954a495195a9f9d931051a4c1f1eba69bbfb896f6fe4601ca7ac4a6c57e030ea" exitCode=0 Mar 20 11:32:02 crc kubenswrapper[4860]: I0320 11:32:02.885177 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566772-bq77p" event={"ID":"e22a4be9-9edf-4029-b504-f5c059318959","Type":"ContainerDied","Data":"954a495195a9f9d931051a4c1f1eba69bbfb896f6fe4601ca7ac4a6c57e030ea"} Mar 20 11:32:04 crc kubenswrapper[4860]: I0320 11:32:04.224393 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566772-bq77p" Mar 20 11:32:04 crc kubenswrapper[4860]: I0320 11:32:04.379636 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7b8fx\" (UniqueName: \"kubernetes.io/projected/e22a4be9-9edf-4029-b504-f5c059318959-kube-api-access-7b8fx\") pod \"e22a4be9-9edf-4029-b504-f5c059318959\" (UID: \"e22a4be9-9edf-4029-b504-f5c059318959\") " Mar 20 11:32:04 crc kubenswrapper[4860]: I0320 11:32:04.389847 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e22a4be9-9edf-4029-b504-f5c059318959-kube-api-access-7b8fx" (OuterVolumeSpecName: "kube-api-access-7b8fx") pod "e22a4be9-9edf-4029-b504-f5c059318959" (UID: "e22a4be9-9edf-4029-b504-f5c059318959"). InnerVolumeSpecName "kube-api-access-7b8fx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:32:04 crc kubenswrapper[4860]: I0320 11:32:04.485683 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7b8fx\" (UniqueName: \"kubernetes.io/projected/e22a4be9-9edf-4029-b504-f5c059318959-kube-api-access-7b8fx\") on node \"crc\" DevicePath \"\"" Mar 20 11:32:04 crc kubenswrapper[4860]: I0320 11:32:04.903588 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566772-bq77p" event={"ID":"e22a4be9-9edf-4029-b504-f5c059318959","Type":"ContainerDied","Data":"83a239c60ed9ecb77830fb64d9ff1297eff5199a99cf5a3d882155d3a988e68e"} Mar 20 11:32:04 crc kubenswrapper[4860]: I0320 11:32:04.903640 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83a239c60ed9ecb77830fb64d9ff1297eff5199a99cf5a3d882155d3a988e68e" Mar 20 11:32:04 crc kubenswrapper[4860]: I0320 11:32:04.903684 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566772-bq77p" Mar 20 11:32:05 crc kubenswrapper[4860]: I0320 11:32:05.304416 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566766-fhwnn"] Mar 20 11:32:05 crc kubenswrapper[4860]: I0320 11:32:05.311497 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566766-fhwnn"] Mar 20 11:32:05 crc kubenswrapper[4860]: I0320 11:32:05.423708 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a62a673-bde2-4cf8-bae1-56252a15c71e" path="/var/lib/kubelet/pods/8a62a673-bde2-4cf8-bae1-56252a15c71e/volumes" Mar 20 11:32:22 crc kubenswrapper[4860]: I0320 11:32:22.344723 4860 patch_prober.go:28] interesting pod/machine-config-daemon-kvdqp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:32:22 crc kubenswrapper[4860]: I0320 11:32:22.345650 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:32:47 crc kubenswrapper[4860]: I0320 11:32:47.159521 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-h8mrm"] Mar 20 11:32:47 crc kubenswrapper[4860]: E0320 11:32:47.162076 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e22a4be9-9edf-4029-b504-f5c059318959" containerName="oc" Mar 20 11:32:47 crc kubenswrapper[4860]: I0320 11:32:47.162189 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="e22a4be9-9edf-4029-b504-f5c059318959" containerName="oc" Mar 20 11:32:47 crc 
kubenswrapper[4860]: I0320 11:32:47.162468 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="e22a4be9-9edf-4029-b504-f5c059318959" containerName="oc"
Mar 20 11:32:47 crc kubenswrapper[4860]: I0320 11:32:47.163966 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-h8mrm"
Mar 20 11:32:47 crc kubenswrapper[4860]: I0320 11:32:47.166502 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-h8mrm"]
Mar 20 11:32:47 crc kubenswrapper[4860]: I0320 11:32:47.211750 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2de32a4d-295e-4e53-9224-445137c28938-catalog-content\") pod \"redhat-operators-h8mrm\" (UID: \"2de32a4d-295e-4e53-9224-445137c28938\") " pod="openshift-marketplace/redhat-operators-h8mrm"
Mar 20 11:32:47 crc kubenswrapper[4860]: I0320 11:32:47.211943 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6b796\" (UniqueName: \"kubernetes.io/projected/2de32a4d-295e-4e53-9224-445137c28938-kube-api-access-6b796\") pod \"redhat-operators-h8mrm\" (UID: \"2de32a4d-295e-4e53-9224-445137c28938\") " pod="openshift-marketplace/redhat-operators-h8mrm"
Mar 20 11:32:47 crc kubenswrapper[4860]: I0320 11:32:47.212019 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2de32a4d-295e-4e53-9224-445137c28938-utilities\") pod \"redhat-operators-h8mrm\" (UID: \"2de32a4d-295e-4e53-9224-445137c28938\") " pod="openshift-marketplace/redhat-operators-h8mrm"
Mar 20 11:32:47 crc kubenswrapper[4860]: I0320 11:32:47.314115 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6b796\" (UniqueName: \"kubernetes.io/projected/2de32a4d-295e-4e53-9224-445137c28938-kube-api-access-6b796\") pod \"redhat-operators-h8mrm\" (UID: \"2de32a4d-295e-4e53-9224-445137c28938\") " pod="openshift-marketplace/redhat-operators-h8mrm"
Mar 20 11:32:47 crc kubenswrapper[4860]: I0320 11:32:47.314198 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2de32a4d-295e-4e53-9224-445137c28938-utilities\") pod \"redhat-operators-h8mrm\" (UID: \"2de32a4d-295e-4e53-9224-445137c28938\") " pod="openshift-marketplace/redhat-operators-h8mrm"
Mar 20 11:32:47 crc kubenswrapper[4860]: I0320 11:32:47.314258 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2de32a4d-295e-4e53-9224-445137c28938-catalog-content\") pod \"redhat-operators-h8mrm\" (UID: \"2de32a4d-295e-4e53-9224-445137c28938\") " pod="openshift-marketplace/redhat-operators-h8mrm"
Mar 20 11:32:47 crc kubenswrapper[4860]: I0320 11:32:47.315493 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2de32a4d-295e-4e53-9224-445137c28938-catalog-content\") pod \"redhat-operators-h8mrm\" (UID: \"2de32a4d-295e-4e53-9224-445137c28938\") " pod="openshift-marketplace/redhat-operators-h8mrm"
Mar 20 11:32:47 crc kubenswrapper[4860]: I0320 11:32:47.315698 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2de32a4d-295e-4e53-9224-445137c28938-utilities\") pod \"redhat-operators-h8mrm\" (UID: \"2de32a4d-295e-4e53-9224-445137c28938\") " pod="openshift-marketplace/redhat-operators-h8mrm"
Mar 20 11:32:47 crc kubenswrapper[4860]: I0320 11:32:47.337864 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6b796\" (UniqueName: \"kubernetes.io/projected/2de32a4d-295e-4e53-9224-445137c28938-kube-api-access-6b796\") pod \"redhat-operators-h8mrm\" (UID: \"2de32a4d-295e-4e53-9224-445137c28938\") " pod="openshift-marketplace/redhat-operators-h8mrm"
Mar 20 11:32:47 crc kubenswrapper[4860]: I0320 11:32:47.488366 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-h8mrm"
Mar 20 11:32:48 crc kubenswrapper[4860]: I0320 11:32:48.565305 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-h8mrm"]
Mar 20 11:32:48 crc kubenswrapper[4860]: I0320 11:32:48.679808 4860 scope.go:117] "RemoveContainer" containerID="c8ce87256d0115d4f80012e1ce15b95a524f1b59f29da3d0c9299a0f68d2780e"
Mar 20 11:32:49 crc kubenswrapper[4860]: I0320 11:32:49.265543 4860 generic.go:334] "Generic (PLEG): container finished" podID="2de32a4d-295e-4e53-9224-445137c28938" containerID="e2395973e064152259253eb58c0c6a0446f283f6f52bc2a04815836f797ae9be" exitCode=0
Mar 20 11:32:49 crc kubenswrapper[4860]: I0320 11:32:49.265591 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h8mrm" event={"ID":"2de32a4d-295e-4e53-9224-445137c28938","Type":"ContainerDied","Data":"e2395973e064152259253eb58c0c6a0446f283f6f52bc2a04815836f797ae9be"}
Mar 20 11:32:49 crc kubenswrapper[4860]: I0320 11:32:49.265642 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h8mrm" event={"ID":"2de32a4d-295e-4e53-9224-445137c28938","Type":"ContainerStarted","Data":"84a0d03260848fa62a51d1fe5b7dc42a764bbd619dfd6c2877c6efe12414a0da"}
Mar 20 11:32:50 crc kubenswrapper[4860]: I0320 11:32:50.276674 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h8mrm" event={"ID":"2de32a4d-295e-4e53-9224-445137c28938","Type":"ContainerStarted","Data":"e840894dd4faef1c4efdefff7e91fad1e5c6f6dbc5fd5e68fc6a294c9ad64bf2"}
Mar 20 11:32:52 crc kubenswrapper[4860]: I0320 11:32:52.387358 4860 patch_prober.go:28] interesting pod/machine-config-daemon-kvdqp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 11:32:52 crc kubenswrapper[4860]: I0320 11:32:52.388341 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 11:32:52 crc kubenswrapper[4860]: I0320 11:32:52.388818 4860 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp"
Mar 20 11:32:52 crc kubenswrapper[4860]: I0320 11:32:52.389535 4860 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"174cf0258fb93c52c964e89ccb7ea77d9cb0e4fb77f37a49b81be365390d1f67"} pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 11:32:52 crc kubenswrapper[4860]: I0320 11:32:52.389623 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" containerID="cri-o://174cf0258fb93c52c964e89ccb7ea77d9cb0e4fb77f37a49b81be365390d1f67" gracePeriod=600
Mar 20 11:32:52 crc kubenswrapper[4860]: E0320 11:32:52.534516 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080"
Mar 20 11:32:53 crc kubenswrapper[4860]: I0320 11:32:53.410115 4860 generic.go:334] "Generic (PLEG): container finished" podID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerID="174cf0258fb93c52c964e89ccb7ea77d9cb0e4fb77f37a49b81be365390d1f67" exitCode=0
Mar 20 11:32:53 crc kubenswrapper[4860]: I0320 11:32:53.410771 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" event={"ID":"6a9df230-75a1-4b64-8d00-c179e9c19080","Type":"ContainerDied","Data":"174cf0258fb93c52c964e89ccb7ea77d9cb0e4fb77f37a49b81be365390d1f67"}
Mar 20 11:32:53 crc kubenswrapper[4860]: I0320 11:32:53.410897 4860 scope.go:117] "RemoveContainer" containerID="b5a83b0edf3d114fb0c90b2b89dbd92039d6760397381d08a497f8b893a5686e"
Mar 20 11:32:53 crc kubenswrapper[4860]: I0320 11:32:53.411696 4860 scope.go:117] "RemoveContainer" containerID="174cf0258fb93c52c964e89ccb7ea77d9cb0e4fb77f37a49b81be365390d1f67"
Mar 20 11:32:53 crc kubenswrapper[4860]: E0320 11:32:53.412029 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080"
Mar 20 11:32:54 crc kubenswrapper[4860]: I0320 11:32:54.423927 4860 generic.go:334] "Generic (PLEG): container finished" podID="2de32a4d-295e-4e53-9224-445137c28938" containerID="e840894dd4faef1c4efdefff7e91fad1e5c6f6dbc5fd5e68fc6a294c9ad64bf2" exitCode=0
Mar 20 11:32:54 crc kubenswrapper[4860]: I0320 11:32:54.423990 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h8mrm" event={"ID":"2de32a4d-295e-4e53-9224-445137c28938","Type":"ContainerDied","Data":"e840894dd4faef1c4efdefff7e91fad1e5c6f6dbc5fd5e68fc6a294c9ad64bf2"}
Mar 20 11:32:55 crc kubenswrapper[4860]: I0320 11:32:55.451574 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h8mrm" event={"ID":"2de32a4d-295e-4e53-9224-445137c28938","Type":"ContainerStarted","Data":"c313902cf719e8c3d691f2bb4c13d13ceada33e566d455668762f62c1a440606"}
Mar 20 11:32:55 crc kubenswrapper[4860]: I0320 11:32:55.481700 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-h8mrm" podStartSLOduration=2.907645797 podStartE2EDuration="8.481675437s" podCreationTimestamp="2026-03-20 11:32:47 +0000 UTC" firstStartedPulling="2026-03-20 11:32:49.267839099 +0000 UTC m=+2293.489199997" lastFinishedPulling="2026-03-20 11:32:54.841868709 +0000 UTC m=+2299.063229637" observedRunningTime="2026-03-20 11:32:55.480245898 +0000 UTC m=+2299.701606796" watchObservedRunningTime="2026-03-20 11:32:55.481675437 +0000 UTC m=+2299.703036335"
Mar 20 11:32:57 crc kubenswrapper[4860]: I0320 11:32:57.695818 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-h8mrm"
Mar 20 11:32:57 crc kubenswrapper[4860]: I0320 11:32:57.697193 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-h8mrm"
Mar 20 11:32:58 crc kubenswrapper[4860]: I0320 11:32:58.726378 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-h8mrm" podUID="2de32a4d-295e-4e53-9224-445137c28938" containerName="registry-server" probeResult="failure" output=<
Mar 20 11:32:58 crc kubenswrapper[4860]: timeout: failed to connect service ":50051" within 1s
Mar 20 11:32:58 crc kubenswrapper[4860]: >
Mar 20 11:33:07 crc kubenswrapper[4860]: I0320 11:33:07.418857 4860 scope.go:117] "RemoveContainer" containerID="174cf0258fb93c52c964e89ccb7ea77d9cb0e4fb77f37a49b81be365390d1f67"
Mar 20 11:33:07 crc kubenswrapper[4860]: E0320 11:33:07.420034 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080"
Mar 20 11:33:07 crc kubenswrapper[4860]: I0320 11:33:07.536628 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-h8mrm"
Mar 20 11:33:07 crc kubenswrapper[4860]: I0320 11:33:07.583523 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-h8mrm"
Mar 20 11:33:07 crc kubenswrapper[4860]: I0320 11:33:07.777508 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-h8mrm"]
Mar 20 11:33:08 crc kubenswrapper[4860]: I0320 11:33:08.794538 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-h8mrm" podUID="2de32a4d-295e-4e53-9224-445137c28938" containerName="registry-server" containerID="cri-o://c313902cf719e8c3d691f2bb4c13d13ceada33e566d455668762f62c1a440606" gracePeriod=2
Mar 20 11:33:09 crc kubenswrapper[4860]: I0320 11:33:09.193135 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-h8mrm"
Mar 20 11:33:09 crc kubenswrapper[4860]: I0320 11:33:09.304951 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2de32a4d-295e-4e53-9224-445137c28938-utilities\") pod \"2de32a4d-295e-4e53-9224-445137c28938\" (UID: \"2de32a4d-295e-4e53-9224-445137c28938\") "
Mar 20 11:33:09 crc kubenswrapper[4860]: I0320 11:33:09.305171 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2de32a4d-295e-4e53-9224-445137c28938-catalog-content\") pod \"2de32a4d-295e-4e53-9224-445137c28938\" (UID: \"2de32a4d-295e-4e53-9224-445137c28938\") "
Mar 20 11:33:09 crc kubenswrapper[4860]: I0320 11:33:09.305346 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6b796\" (UniqueName: \"kubernetes.io/projected/2de32a4d-295e-4e53-9224-445137c28938-kube-api-access-6b796\") pod \"2de32a4d-295e-4e53-9224-445137c28938\" (UID: \"2de32a4d-295e-4e53-9224-445137c28938\") "
Mar 20 11:33:09 crc kubenswrapper[4860]: I0320 11:33:09.308142 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2de32a4d-295e-4e53-9224-445137c28938-utilities" (OuterVolumeSpecName: "utilities") pod "2de32a4d-295e-4e53-9224-445137c28938" (UID: "2de32a4d-295e-4e53-9224-445137c28938"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 11:33:09 crc kubenswrapper[4860]: I0320 11:33:09.313774 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2de32a4d-295e-4e53-9224-445137c28938-kube-api-access-6b796" (OuterVolumeSpecName: "kube-api-access-6b796") pod "2de32a4d-295e-4e53-9224-445137c28938" (UID: "2de32a4d-295e-4e53-9224-445137c28938"). InnerVolumeSpecName "kube-api-access-6b796". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 11:33:09 crc kubenswrapper[4860]: I0320 11:33:09.407700 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6b796\" (UniqueName: \"kubernetes.io/projected/2de32a4d-295e-4e53-9224-445137c28938-kube-api-access-6b796\") on node \"crc\" DevicePath \"\""
Mar 20 11:33:09 crc kubenswrapper[4860]: I0320 11:33:09.407746 4860 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2de32a4d-295e-4e53-9224-445137c28938-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 11:33:09 crc kubenswrapper[4860]: I0320 11:33:09.438784 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2de32a4d-295e-4e53-9224-445137c28938-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2de32a4d-295e-4e53-9224-445137c28938" (UID: "2de32a4d-295e-4e53-9224-445137c28938"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 11:33:09 crc kubenswrapper[4860]: I0320 11:33:09.509849 4860 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2de32a4d-295e-4e53-9224-445137c28938-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 11:33:09 crc kubenswrapper[4860]: I0320 11:33:09.806037 4860 generic.go:334] "Generic (PLEG): container finished" podID="2de32a4d-295e-4e53-9224-445137c28938" containerID="c313902cf719e8c3d691f2bb4c13d13ceada33e566d455668762f62c1a440606" exitCode=0
Mar 20 11:33:09 crc kubenswrapper[4860]: I0320 11:33:09.806104 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h8mrm" event={"ID":"2de32a4d-295e-4e53-9224-445137c28938","Type":"ContainerDied","Data":"c313902cf719e8c3d691f2bb4c13d13ceada33e566d455668762f62c1a440606"}
Mar 20 11:33:09 crc kubenswrapper[4860]: I0320 11:33:09.806115 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-h8mrm"
Mar 20 11:33:09 crc kubenswrapper[4860]: I0320 11:33:09.806149 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h8mrm" event={"ID":"2de32a4d-295e-4e53-9224-445137c28938","Type":"ContainerDied","Data":"84a0d03260848fa62a51d1fe5b7dc42a764bbd619dfd6c2877c6efe12414a0da"}
Mar 20 11:33:09 crc kubenswrapper[4860]: I0320 11:33:09.806179 4860 scope.go:117] "RemoveContainer" containerID="c313902cf719e8c3d691f2bb4c13d13ceada33e566d455668762f62c1a440606"
Mar 20 11:33:09 crc kubenswrapper[4860]: I0320 11:33:09.829805 4860 scope.go:117] "RemoveContainer" containerID="e840894dd4faef1c4efdefff7e91fad1e5c6f6dbc5fd5e68fc6a294c9ad64bf2"
Mar 20 11:33:09 crc kubenswrapper[4860]: I0320 11:33:09.849216 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-h8mrm"]
Mar 20 11:33:09 crc kubenswrapper[4860]: I0320 11:33:09.854823 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-h8mrm"]
Mar 20 11:33:09 crc kubenswrapper[4860]: I0320 11:33:09.877345 4860 scope.go:117] "RemoveContainer" containerID="e2395973e064152259253eb58c0c6a0446f283f6f52bc2a04815836f797ae9be"
Mar 20 11:33:09 crc kubenswrapper[4860]: I0320 11:33:09.897381 4860 scope.go:117] "RemoveContainer" containerID="c313902cf719e8c3d691f2bb4c13d13ceada33e566d455668762f62c1a440606"
Mar 20 11:33:09 crc kubenswrapper[4860]: E0320 11:33:09.898109 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c313902cf719e8c3d691f2bb4c13d13ceada33e566d455668762f62c1a440606\": container with ID starting with c313902cf719e8c3d691f2bb4c13d13ceada33e566d455668762f62c1a440606 not found: ID does not exist" containerID="c313902cf719e8c3d691f2bb4c13d13ceada33e566d455668762f62c1a440606"
Mar 20 11:33:09 crc kubenswrapper[4860]: I0320 11:33:09.898150 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c313902cf719e8c3d691f2bb4c13d13ceada33e566d455668762f62c1a440606"} err="failed to get container status \"c313902cf719e8c3d691f2bb4c13d13ceada33e566d455668762f62c1a440606\": rpc error: code = NotFound desc = could not find container \"c313902cf719e8c3d691f2bb4c13d13ceada33e566d455668762f62c1a440606\": container with ID starting with c313902cf719e8c3d691f2bb4c13d13ceada33e566d455668762f62c1a440606 not found: ID does not exist"
Mar 20 11:33:09 crc kubenswrapper[4860]: I0320 11:33:09.898182 4860 scope.go:117] "RemoveContainer" containerID="e840894dd4faef1c4efdefff7e91fad1e5c6f6dbc5fd5e68fc6a294c9ad64bf2"
Mar 20 11:33:09 crc kubenswrapper[4860]: E0320 11:33:09.898891 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e840894dd4faef1c4efdefff7e91fad1e5c6f6dbc5fd5e68fc6a294c9ad64bf2\": container with ID starting with e840894dd4faef1c4efdefff7e91fad1e5c6f6dbc5fd5e68fc6a294c9ad64bf2 not found: ID does not exist" containerID="e840894dd4faef1c4efdefff7e91fad1e5c6f6dbc5fd5e68fc6a294c9ad64bf2"
Mar 20 11:33:09 crc kubenswrapper[4860]: I0320 11:33:09.898946 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e840894dd4faef1c4efdefff7e91fad1e5c6f6dbc5fd5e68fc6a294c9ad64bf2"} err="failed to get container status \"e840894dd4faef1c4efdefff7e91fad1e5c6f6dbc5fd5e68fc6a294c9ad64bf2\": rpc error: code = NotFound desc = could not find container \"e840894dd4faef1c4efdefff7e91fad1e5c6f6dbc5fd5e68fc6a294c9ad64bf2\": container with ID starting with e840894dd4faef1c4efdefff7e91fad1e5c6f6dbc5fd5e68fc6a294c9ad64bf2 not found: ID does not exist"
Mar 20 11:33:09 crc kubenswrapper[4860]: I0320 11:33:09.898986 4860 scope.go:117] "RemoveContainer" containerID="e2395973e064152259253eb58c0c6a0446f283f6f52bc2a04815836f797ae9be"
Mar 20 11:33:09 crc kubenswrapper[4860]: E0320 11:33:09.899733 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2395973e064152259253eb58c0c6a0446f283f6f52bc2a04815836f797ae9be\": container with ID starting with e2395973e064152259253eb58c0c6a0446f283f6f52bc2a04815836f797ae9be not found: ID does not exist" containerID="e2395973e064152259253eb58c0c6a0446f283f6f52bc2a04815836f797ae9be"
Mar 20 11:33:09 crc kubenswrapper[4860]: I0320 11:33:09.899806 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2395973e064152259253eb58c0c6a0446f283f6f52bc2a04815836f797ae9be"} err="failed to get container status \"e2395973e064152259253eb58c0c6a0446f283f6f52bc2a04815836f797ae9be\": rpc error: code = NotFound desc = could not find container \"e2395973e064152259253eb58c0c6a0446f283f6f52bc2a04815836f797ae9be\": container with ID starting with e2395973e064152259253eb58c0c6a0446f283f6f52bc2a04815836f797ae9be not found: ID does not exist"
Mar 20 11:33:11 crc kubenswrapper[4860]: I0320 11:33:11.422955 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2de32a4d-295e-4e53-9224-445137c28938" path="/var/lib/kubelet/pods/2de32a4d-295e-4e53-9224-445137c28938/volumes"
Mar 20 11:33:19 crc kubenswrapper[4860]: I0320 11:33:19.587386 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xphpt"]
Mar 20 11:33:19 crc kubenswrapper[4860]: E0320 11:33:19.588712 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2de32a4d-295e-4e53-9224-445137c28938" containerName="extract-content"
Mar 20 11:33:19 crc kubenswrapper[4860]: I0320 11:33:19.588734 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="2de32a4d-295e-4e53-9224-445137c28938" containerName="extract-content"
Mar 20 11:33:19 crc kubenswrapper[4860]: E0320 11:33:19.588764 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2de32a4d-295e-4e53-9224-445137c28938" containerName="extract-utilities"
Mar 20 11:33:19 crc kubenswrapper[4860]: I0320 11:33:19.588770 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="2de32a4d-295e-4e53-9224-445137c28938" containerName="extract-utilities"
Mar 20 11:33:19 crc kubenswrapper[4860]: E0320 11:33:19.588787 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2de32a4d-295e-4e53-9224-445137c28938" containerName="registry-server"
Mar 20 11:33:19 crc kubenswrapper[4860]: I0320 11:33:19.588792 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="2de32a4d-295e-4e53-9224-445137c28938" containerName="registry-server"
Mar 20 11:33:19 crc kubenswrapper[4860]: I0320 11:33:19.588927 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="2de32a4d-295e-4e53-9224-445137c28938" containerName="registry-server"
Mar 20 11:33:19 crc kubenswrapper[4860]: I0320 11:33:19.590169 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xphpt"
Mar 20 11:33:19 crc kubenswrapper[4860]: I0320 11:33:19.627635 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xphpt"]
Mar 20 11:33:19 crc kubenswrapper[4860]: I0320 11:33:19.676523 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca22abec-1b58-4b4d-a3a8-0744e4684074-catalog-content\") pod \"certified-operators-xphpt\" (UID: \"ca22abec-1b58-4b4d-a3a8-0744e4684074\") " pod="openshift-marketplace/certified-operators-xphpt"
Mar 20 11:33:19 crc kubenswrapper[4860]: I0320 11:33:19.677374 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlv8l\" (UniqueName: \"kubernetes.io/projected/ca22abec-1b58-4b4d-a3a8-0744e4684074-kube-api-access-hlv8l\") pod \"certified-operators-xphpt\" (UID: \"ca22abec-1b58-4b4d-a3a8-0744e4684074\") " pod="openshift-marketplace/certified-operators-xphpt"
Mar 20 11:33:19 crc kubenswrapper[4860]: I0320 11:33:19.677494 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca22abec-1b58-4b4d-a3a8-0744e4684074-utilities\") pod \"certified-operators-xphpt\" (UID: \"ca22abec-1b58-4b4d-a3a8-0744e4684074\") " pod="openshift-marketplace/certified-operators-xphpt"
Mar 20 11:33:19 crc kubenswrapper[4860]: I0320 11:33:19.779053 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlv8l\" (UniqueName: \"kubernetes.io/projected/ca22abec-1b58-4b4d-a3a8-0744e4684074-kube-api-access-hlv8l\") pod \"certified-operators-xphpt\" (UID: \"ca22abec-1b58-4b4d-a3a8-0744e4684074\") " pod="openshift-marketplace/certified-operators-xphpt"
Mar 20 11:33:19 crc kubenswrapper[4860]: I0320 11:33:19.779102 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca22abec-1b58-4b4d-a3a8-0744e4684074-utilities\") pod \"certified-operators-xphpt\" (UID: \"ca22abec-1b58-4b4d-a3a8-0744e4684074\") " pod="openshift-marketplace/certified-operators-xphpt"
Mar 20 11:33:19 crc kubenswrapper[4860]: I0320 11:33:19.779183 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca22abec-1b58-4b4d-a3a8-0744e4684074-catalog-content\") pod \"certified-operators-xphpt\" (UID: \"ca22abec-1b58-4b4d-a3a8-0744e4684074\") " pod="openshift-marketplace/certified-operators-xphpt"
Mar 20 11:33:19 crc kubenswrapper[4860]: I0320 11:33:19.779693 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca22abec-1b58-4b4d-a3a8-0744e4684074-catalog-content\") pod \"certified-operators-xphpt\" (UID: \"ca22abec-1b58-4b4d-a3a8-0744e4684074\") " pod="openshift-marketplace/certified-operators-xphpt"
Mar 20 11:33:19 crc kubenswrapper[4860]: I0320 11:33:19.780166 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca22abec-1b58-4b4d-a3a8-0744e4684074-utilities\") pod \"certified-operators-xphpt\" (UID: \"ca22abec-1b58-4b4d-a3a8-0744e4684074\") " pod="openshift-marketplace/certified-operators-xphpt"
Mar 20 11:33:19 crc kubenswrapper[4860]: I0320 11:33:19.801913 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlv8l\" (UniqueName: \"kubernetes.io/projected/ca22abec-1b58-4b4d-a3a8-0744e4684074-kube-api-access-hlv8l\") pod \"certified-operators-xphpt\" (UID: \"ca22abec-1b58-4b4d-a3a8-0744e4684074\") " pod="openshift-marketplace/certified-operators-xphpt"
Mar 20 11:33:19 crc kubenswrapper[4860]: I0320 11:33:19.918539 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xphpt"
Mar 20 11:33:20 crc kubenswrapper[4860]: I0320 11:33:20.457933 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xphpt"]
Mar 20 11:33:20 crc kubenswrapper[4860]: I0320 11:33:20.889105 4860 generic.go:334] "Generic (PLEG): container finished" podID="ca22abec-1b58-4b4d-a3a8-0744e4684074" containerID="7b2b611de9e8ecef269217ade01c2b26e99a9b22cd2ba20b7dc2cc24557abe0e" exitCode=0
Mar 20 11:33:20 crc kubenswrapper[4860]: I0320 11:33:20.890257 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xphpt" event={"ID":"ca22abec-1b58-4b4d-a3a8-0744e4684074","Type":"ContainerDied","Data":"7b2b611de9e8ecef269217ade01c2b26e99a9b22cd2ba20b7dc2cc24557abe0e"}
Mar 20 11:33:20 crc kubenswrapper[4860]: I0320 11:33:20.890413 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xphpt" event={"ID":"ca22abec-1b58-4b4d-a3a8-0744e4684074","Type":"ContainerStarted","Data":"983ed4b1339ec96d4ba41f7871568d90c8c49c98f3f71ee320948724affbf5ff"}
Mar 20 11:33:20 crc kubenswrapper[4860]: I0320 11:33:20.891640 4860 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 20 11:33:21 crc kubenswrapper[4860]: I0320 11:33:21.898200 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xphpt" event={"ID":"ca22abec-1b58-4b4d-a3a8-0744e4684074","Type":"ContainerStarted","Data":"6f9b3d8db31b52d3c68c31c34984b6ae9a0c8f71daa6f9d5f0f826f06de3fd53"}
Mar 20 11:33:22 crc kubenswrapper[4860]: I0320 11:33:22.413358 4860 scope.go:117] "RemoveContainer" containerID="174cf0258fb93c52c964e89ccb7ea77d9cb0e4fb77f37a49b81be365390d1f67"
Mar 20 11:33:22 crc kubenswrapper[4860]: E0320 11:33:22.414188 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080"
Mar 20 11:33:22 crc kubenswrapper[4860]: I0320 11:33:22.907271 4860 generic.go:334] "Generic (PLEG): container finished" podID="ca22abec-1b58-4b4d-a3a8-0744e4684074" containerID="6f9b3d8db31b52d3c68c31c34984b6ae9a0c8f71daa6f9d5f0f826f06de3fd53" exitCode=0
Mar 20 11:33:22 crc kubenswrapper[4860]: I0320 11:33:22.907340 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xphpt" event={"ID":"ca22abec-1b58-4b4d-a3a8-0744e4684074","Type":"ContainerDied","Data":"6f9b3d8db31b52d3c68c31c34984b6ae9a0c8f71daa6f9d5f0f826f06de3fd53"}
Mar 20 11:33:23 crc kubenswrapper[4860]: I0320 11:33:23.916426 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xphpt" event={"ID":"ca22abec-1b58-4b4d-a3a8-0744e4684074","Type":"ContainerStarted","Data":"fb68db13dd5693e8a8378284b4da077c66560aeac4b3effe34ac00faa66c6a08"}
Mar 20 11:33:23 crc kubenswrapper[4860]: I0320 11:33:23.944540 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xphpt" podStartSLOduration=2.486026149 podStartE2EDuration="4.944515409s" podCreationTimestamp="2026-03-20 11:33:19 +0000 UTC" firstStartedPulling="2026-03-20 11:33:20.891374548 +0000 UTC m=+2325.112735446" lastFinishedPulling="2026-03-20 11:33:23.349863798 +0000 UTC m=+2327.571224706" observedRunningTime="2026-03-20 11:33:23.938904678 +0000 UTC m=+2328.160265586" watchObservedRunningTime="2026-03-20 11:33:23.944515409 +0000 UTC m=+2328.165876307"
Mar 20 11:33:29 crc kubenswrapper[4860]: I0320 11:33:29.919927 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xphpt"
Mar 20 11:33:29 crc kubenswrapper[4860]: I0320 11:33:29.921064 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xphpt"
Mar 20 11:33:29 crc kubenswrapper[4860]: I0320 11:33:29.978212 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xphpt"
Mar 20 11:33:30 crc kubenswrapper[4860]: I0320 11:33:30.036718 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xphpt"
Mar 20 11:33:32 crc kubenswrapper[4860]: I0320 11:33:32.036804 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2qmpt"]
Mar 20 11:33:32 crc kubenswrapper[4860]: I0320 11:33:32.040134 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2qmpt"
Mar 20 11:33:32 crc kubenswrapper[4860]: I0320 11:33:32.055618 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2qmpt"]
Mar 20 11:33:32 crc kubenswrapper[4860]: I0320 11:33:32.073582 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9a43112-1781-421d-9123-971f77f6739e-catalog-content\") pod \"redhat-marketplace-2qmpt\" (UID: \"b9a43112-1781-421d-9123-971f77f6739e\") " pod="openshift-marketplace/redhat-marketplace-2qmpt"
Mar 20 11:33:32 crc kubenswrapper[4860]: I0320 11:33:32.073680 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qqkt\" (UniqueName: \"kubernetes.io/projected/b9a43112-1781-421d-9123-971f77f6739e-kube-api-access-4qqkt\") pod \"redhat-marketplace-2qmpt\" (UID: \"b9a43112-1781-421d-9123-971f77f6739e\") " pod="openshift-marketplace/redhat-marketplace-2qmpt"
Mar 20 11:33:32 crc kubenswrapper[4860]: I0320 11:33:32.073736 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9a43112-1781-421d-9123-971f77f6739e-utilities\") pod \"redhat-marketplace-2qmpt\" (UID: \"b9a43112-1781-421d-9123-971f77f6739e\") " pod="openshift-marketplace/redhat-marketplace-2qmpt"
Mar 20 11:33:32 crc kubenswrapper[4860]: I0320 11:33:32.175664 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qqkt\" (UniqueName: \"kubernetes.io/projected/b9a43112-1781-421d-9123-971f77f6739e-kube-api-access-4qqkt\") pod \"redhat-marketplace-2qmpt\" (UID: \"b9a43112-1781-421d-9123-971f77f6739e\") " pod="openshift-marketplace/redhat-marketplace-2qmpt"
Mar 20 11:33:32 crc kubenswrapper[4860]: I0320 11:33:32.175725 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9a43112-1781-421d-9123-971f77f6739e-utilities\") pod \"redhat-marketplace-2qmpt\" (UID: \"b9a43112-1781-421d-9123-971f77f6739e\") " pod="openshift-marketplace/redhat-marketplace-2qmpt"
Mar 20 11:33:32 crc kubenswrapper[4860]: I0320 11:33:32.175788 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9a43112-1781-421d-9123-971f77f6739e-catalog-content\") pod \"redhat-marketplace-2qmpt\" (UID: \"b9a43112-1781-421d-9123-971f77f6739e\") " pod="openshift-marketplace/redhat-marketplace-2qmpt"
Mar 20 11:33:32 crc kubenswrapper[4860]: I0320 11:33:32.176412 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9a43112-1781-421d-9123-971f77f6739e-catalog-content\") pod \"redhat-marketplace-2qmpt\" (UID: \"b9a43112-1781-421d-9123-971f77f6739e\") " pod="openshift-marketplace/redhat-marketplace-2qmpt"
Mar 20 11:33:32 crc kubenswrapper[4860]: I0320 11:33:32.176638 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9a43112-1781-421d-9123-971f77f6739e-utilities\") pod \"redhat-marketplace-2qmpt\" (UID: \"b9a43112-1781-421d-9123-971f77f6739e\") " pod="openshift-marketplace/redhat-marketplace-2qmpt"
Mar 20 11:33:32 crc kubenswrapper[4860]: I0320 11:33:32.206325 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qqkt\" (UniqueName: \"kubernetes.io/projected/b9a43112-1781-421d-9123-971f77f6739e-kube-api-access-4qqkt\") pod \"redhat-marketplace-2qmpt\" (UID: \"b9a43112-1781-421d-9123-971f77f6739e\") " pod="openshift-marketplace/redhat-marketplace-2qmpt"
Mar 20 11:33:32 crc kubenswrapper[4860]: I0320 11:33:32.382930 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2qmpt"
Mar 20 11:33:32 crc kubenswrapper[4860]: I0320 11:33:32.839483 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2qmpt"]
Mar 20 11:33:32 crc kubenswrapper[4860]: I0320 11:33:32.988440 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2qmpt" event={"ID":"b9a43112-1781-421d-9123-971f77f6739e","Type":"ContainerStarted","Data":"ace6e6791128e2eaff68c3272eb50bcf3fe0a2d8d4ae2e160cf51cf12052e981"}
Mar 20 11:33:33 crc kubenswrapper[4860]: I0320 11:33:33.997642 4860 generic.go:334] "Generic (PLEG): container finished" podID="b9a43112-1781-421d-9123-971f77f6739e" containerID="eb7ee69cc011a4baac33cf78d514007cd666797e1e2e361a8be5ebf8b7e4c5e8" exitCode=0
Mar 20 11:33:33 crc kubenswrapper[4860]: I0320 11:33:33.998115 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2qmpt" event={"ID":"b9a43112-1781-421d-9123-971f77f6739e","Type":"ContainerDied","Data":"eb7ee69cc011a4baac33cf78d514007cd666797e1e2e361a8be5ebf8b7e4c5e8"}
Mar 20 11:33:35 crc kubenswrapper[4860]: I0320 11:33:35.008944 4860 generic.go:334] "Generic (PLEG): container finished" podID="b9a43112-1781-421d-9123-971f77f6739e" containerID="e11794d01c2a317e0d5de1c29a1335434b1ac423e812be9a342abf8366eddb63" exitCode=0
Mar 20 11:33:35 crc kubenswrapper[4860]: I0320 11:33:35.009016 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2qmpt" event={"ID":"b9a43112-1781-421d-9123-971f77f6739e","Type":"ContainerDied","Data":"e11794d01c2a317e0d5de1c29a1335434b1ac423e812be9a342abf8366eddb63"}
Mar 20 11:33:35 crc kubenswrapper[4860]: I0320 11:33:35.224101 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xphpt"]
Mar 20 11:33:35 crc kubenswrapper[4860]: I0320 11:33:35.224419 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xphpt" podUID="ca22abec-1b58-4b4d-a3a8-0744e4684074" containerName="registry-server" containerID="cri-o://fb68db13dd5693e8a8378284b4da077c66560aeac4b3effe34ac00faa66c6a08" gracePeriod=2
Mar 20 11:33:35 crc kubenswrapper[4860]: I0320 11:33:35.681866 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xphpt"
Mar 20 11:33:35 crc kubenswrapper[4860]: I0320 11:33:35.740493 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca22abec-1b58-4b4d-a3a8-0744e4684074-utilities\") pod \"ca22abec-1b58-4b4d-a3a8-0744e4684074\" (UID: \"ca22abec-1b58-4b4d-a3a8-0744e4684074\") "
Mar 20 11:33:35 crc kubenswrapper[4860]: I0320 11:33:35.740571 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hlv8l\" (UniqueName: \"kubernetes.io/projected/ca22abec-1b58-4b4d-a3a8-0744e4684074-kube-api-access-hlv8l\") pod \"ca22abec-1b58-4b4d-a3a8-0744e4684074\" (UID: \"ca22abec-1b58-4b4d-a3a8-0744e4684074\") "
Mar 20 11:33:35 crc kubenswrapper[4860]: I0320 11:33:35.740839 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca22abec-1b58-4b4d-a3a8-0744e4684074-catalog-content\") pod \"ca22abec-1b58-4b4d-a3a8-0744e4684074\" (UID: \"ca22abec-1b58-4b4d-a3a8-0744e4684074\") "
Mar 20 11:33:35 crc kubenswrapper[4860]: I0320 11:33:35.741600 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca22abec-1b58-4b4d-a3a8-0744e4684074-utilities" (OuterVolumeSpecName: "utilities") pod "ca22abec-1b58-4b4d-a3a8-0744e4684074" (UID: "ca22abec-1b58-4b4d-a3a8-0744e4684074"). InnerVolumeSpecName "utilities".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:33:35 crc kubenswrapper[4860]: I0320 11:33:35.748271 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca22abec-1b58-4b4d-a3a8-0744e4684074-kube-api-access-hlv8l" (OuterVolumeSpecName: "kube-api-access-hlv8l") pod "ca22abec-1b58-4b4d-a3a8-0744e4684074" (UID: "ca22abec-1b58-4b4d-a3a8-0744e4684074"). InnerVolumeSpecName "kube-api-access-hlv8l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:33:35 crc kubenswrapper[4860]: I0320 11:33:35.804982 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca22abec-1b58-4b4d-a3a8-0744e4684074-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ca22abec-1b58-4b4d-a3a8-0744e4684074" (UID: "ca22abec-1b58-4b4d-a3a8-0744e4684074"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:33:35 crc kubenswrapper[4860]: I0320 11:33:35.843473 4860 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca22abec-1b58-4b4d-a3a8-0744e4684074-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:33:35 crc kubenswrapper[4860]: I0320 11:33:35.843526 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hlv8l\" (UniqueName: \"kubernetes.io/projected/ca22abec-1b58-4b4d-a3a8-0744e4684074-kube-api-access-hlv8l\") on node \"crc\" DevicePath \"\"" Mar 20 11:33:35 crc kubenswrapper[4860]: I0320 11:33:35.843540 4860 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca22abec-1b58-4b4d-a3a8-0744e4684074-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:33:36 crc kubenswrapper[4860]: I0320 11:33:36.021299 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2qmpt" 
event={"ID":"b9a43112-1781-421d-9123-971f77f6739e","Type":"ContainerStarted","Data":"e1469ca050f7c90afc00f35e26c3db8fafa10d139777ea4fb8d3af6cacc36d2c"} Mar 20 11:33:36 crc kubenswrapper[4860]: I0320 11:33:36.025000 4860 generic.go:334] "Generic (PLEG): container finished" podID="ca22abec-1b58-4b4d-a3a8-0744e4684074" containerID="fb68db13dd5693e8a8378284b4da077c66560aeac4b3effe34ac00faa66c6a08" exitCode=0 Mar 20 11:33:36 crc kubenswrapper[4860]: I0320 11:33:36.025093 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xphpt" event={"ID":"ca22abec-1b58-4b4d-a3a8-0744e4684074","Type":"ContainerDied","Data":"fb68db13dd5693e8a8378284b4da077c66560aeac4b3effe34ac00faa66c6a08"} Mar 20 11:33:36 crc kubenswrapper[4860]: I0320 11:33:36.025168 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xphpt" event={"ID":"ca22abec-1b58-4b4d-a3a8-0744e4684074","Type":"ContainerDied","Data":"983ed4b1339ec96d4ba41f7871568d90c8c49c98f3f71ee320948724affbf5ff"} Mar 20 11:33:36 crc kubenswrapper[4860]: I0320 11:33:36.025195 4860 scope.go:117] "RemoveContainer" containerID="fb68db13dd5693e8a8378284b4da077c66560aeac4b3effe34ac00faa66c6a08" Mar 20 11:33:36 crc kubenswrapper[4860]: I0320 11:33:36.025121 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xphpt" Mar 20 11:33:36 crc kubenswrapper[4860]: I0320 11:33:36.053569 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2qmpt" podStartSLOduration=2.445158218 podStartE2EDuration="4.053548217s" podCreationTimestamp="2026-03-20 11:33:32 +0000 UTC" firstStartedPulling="2026-03-20 11:33:34.002582301 +0000 UTC m=+2338.223943199" lastFinishedPulling="2026-03-20 11:33:35.6109723 +0000 UTC m=+2339.832333198" observedRunningTime="2026-03-20 11:33:36.048645525 +0000 UTC m=+2340.270006423" watchObservedRunningTime="2026-03-20 11:33:36.053548217 +0000 UTC m=+2340.274909115" Mar 20 11:33:36 crc kubenswrapper[4860]: I0320 11:33:36.055427 4860 scope.go:117] "RemoveContainer" containerID="6f9b3d8db31b52d3c68c31c34984b6ae9a0c8f71daa6f9d5f0f826f06de3fd53" Mar 20 11:33:36 crc kubenswrapper[4860]: I0320 11:33:36.072207 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xphpt"] Mar 20 11:33:36 crc kubenswrapper[4860]: I0320 11:33:36.081572 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xphpt"] Mar 20 11:33:36 crc kubenswrapper[4860]: I0320 11:33:36.109588 4860 scope.go:117] "RemoveContainer" containerID="7b2b611de9e8ecef269217ade01c2b26e99a9b22cd2ba20b7dc2cc24557abe0e" Mar 20 11:33:36 crc kubenswrapper[4860]: I0320 11:33:36.132550 4860 scope.go:117] "RemoveContainer" containerID="fb68db13dd5693e8a8378284b4da077c66560aeac4b3effe34ac00faa66c6a08" Mar 20 11:33:36 crc kubenswrapper[4860]: E0320 11:33:36.133140 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb68db13dd5693e8a8378284b4da077c66560aeac4b3effe34ac00faa66c6a08\": container with ID starting with fb68db13dd5693e8a8378284b4da077c66560aeac4b3effe34ac00faa66c6a08 not found: ID does not exist" 
containerID="fb68db13dd5693e8a8378284b4da077c66560aeac4b3effe34ac00faa66c6a08" Mar 20 11:33:36 crc kubenswrapper[4860]: I0320 11:33:36.133194 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb68db13dd5693e8a8378284b4da077c66560aeac4b3effe34ac00faa66c6a08"} err="failed to get container status \"fb68db13dd5693e8a8378284b4da077c66560aeac4b3effe34ac00faa66c6a08\": rpc error: code = NotFound desc = could not find container \"fb68db13dd5693e8a8378284b4da077c66560aeac4b3effe34ac00faa66c6a08\": container with ID starting with fb68db13dd5693e8a8378284b4da077c66560aeac4b3effe34ac00faa66c6a08 not found: ID does not exist" Mar 20 11:33:36 crc kubenswrapper[4860]: I0320 11:33:36.133251 4860 scope.go:117] "RemoveContainer" containerID="6f9b3d8db31b52d3c68c31c34984b6ae9a0c8f71daa6f9d5f0f826f06de3fd53" Mar 20 11:33:36 crc kubenswrapper[4860]: E0320 11:33:36.135392 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f9b3d8db31b52d3c68c31c34984b6ae9a0c8f71daa6f9d5f0f826f06de3fd53\": container with ID starting with 6f9b3d8db31b52d3c68c31c34984b6ae9a0c8f71daa6f9d5f0f826f06de3fd53 not found: ID does not exist" containerID="6f9b3d8db31b52d3c68c31c34984b6ae9a0c8f71daa6f9d5f0f826f06de3fd53" Mar 20 11:33:36 crc kubenswrapper[4860]: I0320 11:33:36.135429 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f9b3d8db31b52d3c68c31c34984b6ae9a0c8f71daa6f9d5f0f826f06de3fd53"} err="failed to get container status \"6f9b3d8db31b52d3c68c31c34984b6ae9a0c8f71daa6f9d5f0f826f06de3fd53\": rpc error: code = NotFound desc = could not find container \"6f9b3d8db31b52d3c68c31c34984b6ae9a0c8f71daa6f9d5f0f826f06de3fd53\": container with ID starting with 6f9b3d8db31b52d3c68c31c34984b6ae9a0c8f71daa6f9d5f0f826f06de3fd53 not found: ID does not exist" Mar 20 11:33:36 crc kubenswrapper[4860]: I0320 11:33:36.135458 4860 scope.go:117] 
"RemoveContainer" containerID="7b2b611de9e8ecef269217ade01c2b26e99a9b22cd2ba20b7dc2cc24557abe0e" Mar 20 11:33:36 crc kubenswrapper[4860]: E0320 11:33:36.135996 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b2b611de9e8ecef269217ade01c2b26e99a9b22cd2ba20b7dc2cc24557abe0e\": container with ID starting with 7b2b611de9e8ecef269217ade01c2b26e99a9b22cd2ba20b7dc2cc24557abe0e not found: ID does not exist" containerID="7b2b611de9e8ecef269217ade01c2b26e99a9b22cd2ba20b7dc2cc24557abe0e" Mar 20 11:33:36 crc kubenswrapper[4860]: I0320 11:33:36.136023 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b2b611de9e8ecef269217ade01c2b26e99a9b22cd2ba20b7dc2cc24557abe0e"} err="failed to get container status \"7b2b611de9e8ecef269217ade01c2b26e99a9b22cd2ba20b7dc2cc24557abe0e\": rpc error: code = NotFound desc = could not find container \"7b2b611de9e8ecef269217ade01c2b26e99a9b22cd2ba20b7dc2cc24557abe0e\": container with ID starting with 7b2b611de9e8ecef269217ade01c2b26e99a9b22cd2ba20b7dc2cc24557abe0e not found: ID does not exist" Mar 20 11:33:37 crc kubenswrapper[4860]: I0320 11:33:37.418633 4860 scope.go:117] "RemoveContainer" containerID="174cf0258fb93c52c964e89ccb7ea77d9cb0e4fb77f37a49b81be365390d1f67" Mar 20 11:33:37 crc kubenswrapper[4860]: E0320 11:33:37.419513 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:33:37 crc kubenswrapper[4860]: I0320 11:33:37.424205 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="ca22abec-1b58-4b4d-a3a8-0744e4684074" path="/var/lib/kubelet/pods/ca22abec-1b58-4b4d-a3a8-0744e4684074/volumes" Mar 20 11:33:42 crc kubenswrapper[4860]: I0320 11:33:42.383398 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2qmpt" Mar 20 11:33:42 crc kubenswrapper[4860]: I0320 11:33:42.385416 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2qmpt" Mar 20 11:33:42 crc kubenswrapper[4860]: I0320 11:33:42.448499 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2qmpt" Mar 20 11:33:43 crc kubenswrapper[4860]: I0320 11:33:43.138832 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2qmpt" Mar 20 11:33:43 crc kubenswrapper[4860]: I0320 11:33:43.197539 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2qmpt"] Mar 20 11:33:45 crc kubenswrapper[4860]: I0320 11:33:45.103530 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2qmpt" podUID="b9a43112-1781-421d-9123-971f77f6739e" containerName="registry-server" containerID="cri-o://e1469ca050f7c90afc00f35e26c3db8fafa10d139777ea4fb8d3af6cacc36d2c" gracePeriod=2 Mar 20 11:33:46 crc kubenswrapper[4860]: I0320 11:33:46.737006 4860 generic.go:334] "Generic (PLEG): container finished" podID="b9a43112-1781-421d-9123-971f77f6739e" containerID="e1469ca050f7c90afc00f35e26c3db8fafa10d139777ea4fb8d3af6cacc36d2c" exitCode=0 Mar 20 11:33:46 crc kubenswrapper[4860]: I0320 11:33:46.737492 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2qmpt" event={"ID":"b9a43112-1781-421d-9123-971f77f6739e","Type":"ContainerDied","Data":"e1469ca050f7c90afc00f35e26c3db8fafa10d139777ea4fb8d3af6cacc36d2c"} Mar 
20 11:33:47 crc kubenswrapper[4860]: I0320 11:33:47.010301 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2qmpt" Mar 20 11:33:47 crc kubenswrapper[4860]: I0320 11:33:47.189446 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9a43112-1781-421d-9123-971f77f6739e-utilities\") pod \"b9a43112-1781-421d-9123-971f77f6739e\" (UID: \"b9a43112-1781-421d-9123-971f77f6739e\") " Mar 20 11:33:47 crc kubenswrapper[4860]: I0320 11:33:47.189685 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9a43112-1781-421d-9123-971f77f6739e-catalog-content\") pod \"b9a43112-1781-421d-9123-971f77f6739e\" (UID: \"b9a43112-1781-421d-9123-971f77f6739e\") " Mar 20 11:33:47 crc kubenswrapper[4860]: I0320 11:33:47.189727 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qqkt\" (UniqueName: \"kubernetes.io/projected/b9a43112-1781-421d-9123-971f77f6739e-kube-api-access-4qqkt\") pod \"b9a43112-1781-421d-9123-971f77f6739e\" (UID: \"b9a43112-1781-421d-9123-971f77f6739e\") " Mar 20 11:33:47 crc kubenswrapper[4860]: I0320 11:33:47.191256 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9a43112-1781-421d-9123-971f77f6739e-utilities" (OuterVolumeSpecName: "utilities") pod "b9a43112-1781-421d-9123-971f77f6739e" (UID: "b9a43112-1781-421d-9123-971f77f6739e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:33:47 crc kubenswrapper[4860]: I0320 11:33:47.202065 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9a43112-1781-421d-9123-971f77f6739e-kube-api-access-4qqkt" (OuterVolumeSpecName: "kube-api-access-4qqkt") pod "b9a43112-1781-421d-9123-971f77f6739e" (UID: "b9a43112-1781-421d-9123-971f77f6739e"). InnerVolumeSpecName "kube-api-access-4qqkt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:33:47 crc kubenswrapper[4860]: I0320 11:33:47.231443 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9a43112-1781-421d-9123-971f77f6739e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b9a43112-1781-421d-9123-971f77f6739e" (UID: "b9a43112-1781-421d-9123-971f77f6739e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:33:47 crc kubenswrapper[4860]: I0320 11:33:47.292116 4860 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9a43112-1781-421d-9123-971f77f6739e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:33:47 crc kubenswrapper[4860]: I0320 11:33:47.292152 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4qqkt\" (UniqueName: \"kubernetes.io/projected/b9a43112-1781-421d-9123-971f77f6739e-kube-api-access-4qqkt\") on node \"crc\" DevicePath \"\"" Mar 20 11:33:47 crc kubenswrapper[4860]: I0320 11:33:47.292165 4860 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9a43112-1781-421d-9123-971f77f6739e-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:33:47 crc kubenswrapper[4860]: I0320 11:33:47.753124 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2qmpt" 
event={"ID":"b9a43112-1781-421d-9123-971f77f6739e","Type":"ContainerDied","Data":"ace6e6791128e2eaff68c3272eb50bcf3fe0a2d8d4ae2e160cf51cf12052e981"} Mar 20 11:33:47 crc kubenswrapper[4860]: I0320 11:33:47.753240 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2qmpt" Mar 20 11:33:47 crc kubenswrapper[4860]: I0320 11:33:47.753595 4860 scope.go:117] "RemoveContainer" containerID="e1469ca050f7c90afc00f35e26c3db8fafa10d139777ea4fb8d3af6cacc36d2c" Mar 20 11:33:47 crc kubenswrapper[4860]: I0320 11:33:47.780806 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2qmpt"] Mar 20 11:33:47 crc kubenswrapper[4860]: I0320 11:33:47.789631 4860 scope.go:117] "RemoveContainer" containerID="e11794d01c2a317e0d5de1c29a1335434b1ac423e812be9a342abf8366eddb63" Mar 20 11:33:47 crc kubenswrapper[4860]: I0320 11:33:47.794362 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2qmpt"] Mar 20 11:33:47 crc kubenswrapper[4860]: I0320 11:33:47.811494 4860 scope.go:117] "RemoveContainer" containerID="eb7ee69cc011a4baac33cf78d514007cd666797e1e2e361a8be5ebf8b7e4c5e8" Mar 20 11:33:49 crc kubenswrapper[4860]: I0320 11:33:49.416154 4860 scope.go:117] "RemoveContainer" containerID="174cf0258fb93c52c964e89ccb7ea77d9cb0e4fb77f37a49b81be365390d1f67" Mar 20 11:33:49 crc kubenswrapper[4860]: E0320 11:33:49.416876 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:33:49 crc kubenswrapper[4860]: I0320 11:33:49.427912 4860 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="b9a43112-1781-421d-9123-971f77f6739e" path="/var/lib/kubelet/pods/b9a43112-1781-421d-9123-971f77f6739e/volumes" Mar 20 11:34:00 crc kubenswrapper[4860]: I0320 11:34:00.148797 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566774-jmb8l"] Mar 20 11:34:00 crc kubenswrapper[4860]: E0320 11:34:00.149823 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9a43112-1781-421d-9123-971f77f6739e" containerName="extract-content" Mar 20 11:34:00 crc kubenswrapper[4860]: I0320 11:34:00.149840 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9a43112-1781-421d-9123-971f77f6739e" containerName="extract-content" Mar 20 11:34:00 crc kubenswrapper[4860]: E0320 11:34:00.149858 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca22abec-1b58-4b4d-a3a8-0744e4684074" containerName="extract-content" Mar 20 11:34:00 crc kubenswrapper[4860]: I0320 11:34:00.149865 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca22abec-1b58-4b4d-a3a8-0744e4684074" containerName="extract-content" Mar 20 11:34:00 crc kubenswrapper[4860]: E0320 11:34:00.149887 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9a43112-1781-421d-9123-971f77f6739e" containerName="registry-server" Mar 20 11:34:00 crc kubenswrapper[4860]: I0320 11:34:00.149895 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9a43112-1781-421d-9123-971f77f6739e" containerName="registry-server" Mar 20 11:34:00 crc kubenswrapper[4860]: E0320 11:34:00.149907 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9a43112-1781-421d-9123-971f77f6739e" containerName="extract-utilities" Mar 20 11:34:00 crc kubenswrapper[4860]: I0320 11:34:00.149914 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9a43112-1781-421d-9123-971f77f6739e" containerName="extract-utilities" Mar 20 11:34:00 crc kubenswrapper[4860]: E0320 11:34:00.149936 4860 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca22abec-1b58-4b4d-a3a8-0744e4684074" containerName="extract-utilities" Mar 20 11:34:00 crc kubenswrapper[4860]: I0320 11:34:00.149943 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca22abec-1b58-4b4d-a3a8-0744e4684074" containerName="extract-utilities" Mar 20 11:34:00 crc kubenswrapper[4860]: E0320 11:34:00.149958 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca22abec-1b58-4b4d-a3a8-0744e4684074" containerName="registry-server" Mar 20 11:34:00 crc kubenswrapper[4860]: I0320 11:34:00.149965 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca22abec-1b58-4b4d-a3a8-0744e4684074" containerName="registry-server" Mar 20 11:34:00 crc kubenswrapper[4860]: I0320 11:34:00.150119 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca22abec-1b58-4b4d-a3a8-0744e4684074" containerName="registry-server" Mar 20 11:34:00 crc kubenswrapper[4860]: I0320 11:34:00.150140 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9a43112-1781-421d-9123-971f77f6739e" containerName="registry-server" Mar 20 11:34:00 crc kubenswrapper[4860]: I0320 11:34:00.150836 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566774-jmb8l" Mar 20 11:34:00 crc kubenswrapper[4860]: I0320 11:34:00.157765 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566774-jmb8l"] Mar 20 11:34:00 crc kubenswrapper[4860]: I0320 11:34:00.160284 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-82p2f" Mar 20 11:34:00 crc kubenswrapper[4860]: I0320 11:34:00.160573 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:34:00 crc kubenswrapper[4860]: I0320 11:34:00.160793 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:34:00 crc kubenswrapper[4860]: I0320 11:34:00.306170 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dd248\" (UniqueName: \"kubernetes.io/projected/34154cfb-cf91-40a0-8390-78823b11b698-kube-api-access-dd248\") pod \"auto-csr-approver-29566774-jmb8l\" (UID: \"34154cfb-cf91-40a0-8390-78823b11b698\") " pod="openshift-infra/auto-csr-approver-29566774-jmb8l" Mar 20 11:34:00 crc kubenswrapper[4860]: I0320 11:34:00.408411 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dd248\" (UniqueName: \"kubernetes.io/projected/34154cfb-cf91-40a0-8390-78823b11b698-kube-api-access-dd248\") pod \"auto-csr-approver-29566774-jmb8l\" (UID: \"34154cfb-cf91-40a0-8390-78823b11b698\") " pod="openshift-infra/auto-csr-approver-29566774-jmb8l" Mar 20 11:34:00 crc kubenswrapper[4860]: I0320 11:34:00.431711 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dd248\" (UniqueName: \"kubernetes.io/projected/34154cfb-cf91-40a0-8390-78823b11b698-kube-api-access-dd248\") pod \"auto-csr-approver-29566774-jmb8l\" (UID: \"34154cfb-cf91-40a0-8390-78823b11b698\") " 
pod="openshift-infra/auto-csr-approver-29566774-jmb8l" Mar 20 11:34:00 crc kubenswrapper[4860]: I0320 11:34:00.483138 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566774-jmb8l" Mar 20 11:34:00 crc kubenswrapper[4860]: I0320 11:34:00.926274 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566774-jmb8l"] Mar 20 11:34:01 crc kubenswrapper[4860]: I0320 11:34:01.414402 4860 scope.go:117] "RemoveContainer" containerID="174cf0258fb93c52c964e89ccb7ea77d9cb0e4fb77f37a49b81be365390d1f67" Mar 20 11:34:01 crc kubenswrapper[4860]: E0320 11:34:01.414694 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:34:01 crc kubenswrapper[4860]: I0320 11:34:01.876861 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566774-jmb8l" event={"ID":"34154cfb-cf91-40a0-8390-78823b11b698","Type":"ContainerStarted","Data":"aa5e784f19c7560daee1c9bbf661ef9378c178b55f0eab3f7d66816056d92381"} Mar 20 11:34:02 crc kubenswrapper[4860]: I0320 11:34:02.886462 4860 generic.go:334] "Generic (PLEG): container finished" podID="34154cfb-cf91-40a0-8390-78823b11b698" containerID="bd1a330fd37e3c9d04a847aabc8c53a75648f5f571a68d378c964de8d51bbab7" exitCode=0 Mar 20 11:34:02 crc kubenswrapper[4860]: I0320 11:34:02.886565 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566774-jmb8l" event={"ID":"34154cfb-cf91-40a0-8390-78823b11b698","Type":"ContainerDied","Data":"bd1a330fd37e3c9d04a847aabc8c53a75648f5f571a68d378c964de8d51bbab7"} 
Mar 20 11:34:04 crc kubenswrapper[4860]: I0320 11:34:04.174440 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566774-jmb8l" Mar 20 11:34:04 crc kubenswrapper[4860]: I0320 11:34:04.273192 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dd248\" (UniqueName: \"kubernetes.io/projected/34154cfb-cf91-40a0-8390-78823b11b698-kube-api-access-dd248\") pod \"34154cfb-cf91-40a0-8390-78823b11b698\" (UID: \"34154cfb-cf91-40a0-8390-78823b11b698\") " Mar 20 11:34:04 crc kubenswrapper[4860]: I0320 11:34:04.280702 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34154cfb-cf91-40a0-8390-78823b11b698-kube-api-access-dd248" (OuterVolumeSpecName: "kube-api-access-dd248") pod "34154cfb-cf91-40a0-8390-78823b11b698" (UID: "34154cfb-cf91-40a0-8390-78823b11b698"). InnerVolumeSpecName "kube-api-access-dd248". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:34:04 crc kubenswrapper[4860]: I0320 11:34:04.377200 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dd248\" (UniqueName: \"kubernetes.io/projected/34154cfb-cf91-40a0-8390-78823b11b698-kube-api-access-dd248\") on node \"crc\" DevicePath \"\"" Mar 20 11:34:04 crc kubenswrapper[4860]: I0320 11:34:04.902869 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566774-jmb8l" event={"ID":"34154cfb-cf91-40a0-8390-78823b11b698","Type":"ContainerDied","Data":"aa5e784f19c7560daee1c9bbf661ef9378c178b55f0eab3f7d66816056d92381"} Mar 20 11:34:04 crc kubenswrapper[4860]: I0320 11:34:04.903413 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa5e784f19c7560daee1c9bbf661ef9378c178b55f0eab3f7d66816056d92381" Mar 20 11:34:04 crc kubenswrapper[4860]: I0320 11:34:04.902934 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566774-jmb8l"
Mar 20 11:34:05 crc kubenswrapper[4860]: I0320 11:34:05.265574 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566768-r8jtv"]
Mar 20 11:34:05 crc kubenswrapper[4860]: I0320 11:34:05.265679 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566768-r8jtv"]
Mar 20 11:34:05 crc kubenswrapper[4860]: I0320 11:34:05.425043 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7b6bebe-e27d-4ca9-9b8b-3f6ba51b8cd2" path="/var/lib/kubelet/pods/e7b6bebe-e27d-4ca9-9b8b-3f6ba51b8cd2/volumes"
Mar 20 11:34:13 crc kubenswrapper[4860]: I0320 11:34:13.414061 4860 scope.go:117] "RemoveContainer" containerID="174cf0258fb93c52c964e89ccb7ea77d9cb0e4fb77f37a49b81be365390d1f67"
Mar 20 11:34:13 crc kubenswrapper[4860]: E0320 11:34:13.414997 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080"
Mar 20 11:34:26 crc kubenswrapper[4860]: I0320 11:34:26.414334 4860 scope.go:117] "RemoveContainer" containerID="174cf0258fb93c52c964e89ccb7ea77d9cb0e4fb77f37a49b81be365390d1f67"
Mar 20 11:34:26 crc kubenswrapper[4860]: E0320 11:34:26.415388 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080"
Mar 20 11:34:39 crc kubenswrapper[4860]: I0320 11:34:39.413736 4860 scope.go:117] "RemoveContainer" containerID="174cf0258fb93c52c964e89ccb7ea77d9cb0e4fb77f37a49b81be365390d1f67"
Mar 20 11:34:39 crc kubenswrapper[4860]: E0320 11:34:39.414878 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080"
Mar 20 11:34:48 crc kubenswrapper[4860]: I0320 11:34:48.803950 4860 scope.go:117] "RemoveContainer" containerID="43beffcf03fad4254e2bb3cab94aa4a32cf894399c7d22a72063beb87aa2fc0f"
Mar 20 11:34:54 crc kubenswrapper[4860]: I0320 11:34:54.415181 4860 scope.go:117] "RemoveContainer" containerID="174cf0258fb93c52c964e89ccb7ea77d9cb0e4fb77f37a49b81be365390d1f67"
Mar 20 11:34:54 crc kubenswrapper[4860]: E0320 11:34:54.416027 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080"
Mar 20 11:35:08 crc kubenswrapper[4860]: I0320 11:35:08.413944 4860 scope.go:117] "RemoveContainer" containerID="174cf0258fb93c52c964e89ccb7ea77d9cb0e4fb77f37a49b81be365390d1f67"
Mar 20 11:35:08 crc kubenswrapper[4860]: E0320 11:35:08.415754 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080"
Mar 20 11:35:22 crc kubenswrapper[4860]: I0320 11:35:22.414209 4860 scope.go:117] "RemoveContainer" containerID="174cf0258fb93c52c964e89ccb7ea77d9cb0e4fb77f37a49b81be365390d1f67"
Mar 20 11:35:22 crc kubenswrapper[4860]: E0320 11:35:22.416163 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080"
Mar 20 11:35:34 crc kubenswrapper[4860]: I0320 11:35:34.413297 4860 scope.go:117] "RemoveContainer" containerID="174cf0258fb93c52c964e89ccb7ea77d9cb0e4fb77f37a49b81be365390d1f67"
Mar 20 11:35:34 crc kubenswrapper[4860]: E0320 11:35:34.414456 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080"
Mar 20 11:35:45 crc kubenswrapper[4860]: I0320 11:35:45.413989 4860 scope.go:117] "RemoveContainer" containerID="174cf0258fb93c52c964e89ccb7ea77d9cb0e4fb77f37a49b81be365390d1f67"
Mar 20 11:35:45 crc kubenswrapper[4860]: E0320 11:35:45.415161 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080"
Mar 20 11:35:57 crc kubenswrapper[4860]: I0320 11:35:57.418143 4860 scope.go:117] "RemoveContainer" containerID="174cf0258fb93c52c964e89ccb7ea77d9cb0e4fb77f37a49b81be365390d1f67"
Mar 20 11:35:57 crc kubenswrapper[4860]: E0320 11:35:57.419408 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080"
Mar 20 11:36:00 crc kubenswrapper[4860]: I0320 11:36:00.145910 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566776-6j7wf"]
Mar 20 11:36:00 crc kubenswrapper[4860]: E0320 11:36:00.146411 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34154cfb-cf91-40a0-8390-78823b11b698" containerName="oc"
Mar 20 11:36:00 crc kubenswrapper[4860]: I0320 11:36:00.146430 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="34154cfb-cf91-40a0-8390-78823b11b698" containerName="oc"
Mar 20 11:36:00 crc kubenswrapper[4860]: I0320 11:36:00.146642 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="34154cfb-cf91-40a0-8390-78823b11b698" containerName="oc"
Mar 20 11:36:00 crc kubenswrapper[4860]: I0320 11:36:00.147416 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566776-6j7wf"
Mar 20 11:36:00 crc kubenswrapper[4860]: I0320 11:36:00.150319 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 11:36:00 crc kubenswrapper[4860]: I0320 11:36:00.150623 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 11:36:00 crc kubenswrapper[4860]: I0320 11:36:00.150952 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-82p2f"
Mar 20 11:36:00 crc kubenswrapper[4860]: I0320 11:36:00.157522 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566776-6j7wf"]
Mar 20 11:36:00 crc kubenswrapper[4860]: I0320 11:36:00.217036 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2l9r2\" (UniqueName: \"kubernetes.io/projected/fd9943d6-a840-4ba5-b12c-9ebf3cbd1224-kube-api-access-2l9r2\") pod \"auto-csr-approver-29566776-6j7wf\" (UID: \"fd9943d6-a840-4ba5-b12c-9ebf3cbd1224\") " pod="openshift-infra/auto-csr-approver-29566776-6j7wf"
Mar 20 11:36:00 crc kubenswrapper[4860]: I0320 11:36:00.319028 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2l9r2\" (UniqueName: \"kubernetes.io/projected/fd9943d6-a840-4ba5-b12c-9ebf3cbd1224-kube-api-access-2l9r2\") pod \"auto-csr-approver-29566776-6j7wf\" (UID: \"fd9943d6-a840-4ba5-b12c-9ebf3cbd1224\") " pod="openshift-infra/auto-csr-approver-29566776-6j7wf"
Mar 20 11:36:00 crc kubenswrapper[4860]: I0320 11:36:00.352093 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2l9r2\" (UniqueName: \"kubernetes.io/projected/fd9943d6-a840-4ba5-b12c-9ebf3cbd1224-kube-api-access-2l9r2\") pod \"auto-csr-approver-29566776-6j7wf\" (UID: \"fd9943d6-a840-4ba5-b12c-9ebf3cbd1224\") " pod="openshift-infra/auto-csr-approver-29566776-6j7wf"
Mar 20 11:36:00 crc kubenswrapper[4860]: I0320 11:36:00.477722 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566776-6j7wf"
Mar 20 11:36:00 crc kubenswrapper[4860]: I0320 11:36:00.915729 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566776-6j7wf"]
Mar 20 11:36:00 crc kubenswrapper[4860]: W0320 11:36:00.920043 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd9943d6_a840_4ba5_b12c_9ebf3cbd1224.slice/crio-4e00d96b74b5759c3798b31ab9f818aa77b10c25b8d17db54a8dd93bdd8425a4 WatchSource:0}: Error finding container 4e00d96b74b5759c3798b31ab9f818aa77b10c25b8d17db54a8dd93bdd8425a4: Status 404 returned error can't find the container with id 4e00d96b74b5759c3798b31ab9f818aa77b10c25b8d17db54a8dd93bdd8425a4
Mar 20 11:36:00 crc kubenswrapper[4860]: I0320 11:36:00.941702 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566776-6j7wf" event={"ID":"fd9943d6-a840-4ba5-b12c-9ebf3cbd1224","Type":"ContainerStarted","Data":"4e00d96b74b5759c3798b31ab9f818aa77b10c25b8d17db54a8dd93bdd8425a4"}
Mar 20 11:36:02 crc kubenswrapper[4860]: I0320 11:36:02.975006 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566776-6j7wf" event={"ID":"fd9943d6-a840-4ba5-b12c-9ebf3cbd1224","Type":"ContainerStarted","Data":"d289c9e2018c59ba8e3c1ee7b7a591cd4418a9530b3b771d0a13873b671d4bdf"}
Mar 20 11:36:03 crc kubenswrapper[4860]: I0320 11:36:03.983810 4860 generic.go:334] "Generic (PLEG): container finished" podID="fd9943d6-a840-4ba5-b12c-9ebf3cbd1224" containerID="d289c9e2018c59ba8e3c1ee7b7a591cd4418a9530b3b771d0a13873b671d4bdf" exitCode=0
Mar 20 11:36:03 crc kubenswrapper[4860]: I0320 11:36:03.983873 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566776-6j7wf" event={"ID":"fd9943d6-a840-4ba5-b12c-9ebf3cbd1224","Type":"ContainerDied","Data":"d289c9e2018c59ba8e3c1ee7b7a591cd4418a9530b3b771d0a13873b671d4bdf"}
Mar 20 11:36:04 crc kubenswrapper[4860]: I0320 11:36:04.315088 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566776-6j7wf"
Mar 20 11:36:04 crc kubenswrapper[4860]: I0320 11:36:04.490948 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2l9r2\" (UniqueName: \"kubernetes.io/projected/fd9943d6-a840-4ba5-b12c-9ebf3cbd1224-kube-api-access-2l9r2\") pod \"fd9943d6-a840-4ba5-b12c-9ebf3cbd1224\" (UID: \"fd9943d6-a840-4ba5-b12c-9ebf3cbd1224\") "
Mar 20 11:36:04 crc kubenswrapper[4860]: I0320 11:36:04.497870 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd9943d6-a840-4ba5-b12c-9ebf3cbd1224-kube-api-access-2l9r2" (OuterVolumeSpecName: "kube-api-access-2l9r2") pod "fd9943d6-a840-4ba5-b12c-9ebf3cbd1224" (UID: "fd9943d6-a840-4ba5-b12c-9ebf3cbd1224"). InnerVolumeSpecName "kube-api-access-2l9r2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 11:36:04 crc kubenswrapper[4860]: I0320 11:36:04.592523 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2l9r2\" (UniqueName: \"kubernetes.io/projected/fd9943d6-a840-4ba5-b12c-9ebf3cbd1224-kube-api-access-2l9r2\") on node \"crc\" DevicePath \"\""
Mar 20 11:36:04 crc kubenswrapper[4860]: I0320 11:36:04.993291 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566776-6j7wf" event={"ID":"fd9943d6-a840-4ba5-b12c-9ebf3cbd1224","Type":"ContainerDied","Data":"4e00d96b74b5759c3798b31ab9f818aa77b10c25b8d17db54a8dd93bdd8425a4"}
Mar 20 11:36:04 crc kubenswrapper[4860]: I0320 11:36:04.993845 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e00d96b74b5759c3798b31ab9f818aa77b10c25b8d17db54a8dd93bdd8425a4"
Mar 20 11:36:04 crc kubenswrapper[4860]: I0320 11:36:04.993344 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566776-6j7wf"
Mar 20 11:36:05 crc kubenswrapper[4860]: I0320 11:36:05.397548 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566770-bmm72"]
Mar 20 11:36:05 crc kubenswrapper[4860]: I0320 11:36:05.403805 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566770-bmm72"]
Mar 20 11:36:05 crc kubenswrapper[4860]: I0320 11:36:05.424624 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d911fad2-83cb-46e3-8f48-eb6f4b0e5605" path="/var/lib/kubelet/pods/d911fad2-83cb-46e3-8f48-eb6f4b0e5605/volumes"
Mar 20 11:36:09 crc kubenswrapper[4860]: I0320 11:36:09.414360 4860 scope.go:117] "RemoveContainer" containerID="174cf0258fb93c52c964e89ccb7ea77d9cb0e4fb77f37a49b81be365390d1f67"
Mar 20 11:36:09 crc kubenswrapper[4860]: E0320 11:36:09.415181 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080"
Mar 20 11:36:24 crc kubenswrapper[4860]: I0320 11:36:24.415689 4860 scope.go:117] "RemoveContainer" containerID="174cf0258fb93c52c964e89ccb7ea77d9cb0e4fb77f37a49b81be365390d1f67"
Mar 20 11:36:24 crc kubenswrapper[4860]: E0320 11:36:24.417383 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080"
Mar 20 11:36:38 crc kubenswrapper[4860]: I0320 11:36:38.413125 4860 scope.go:117] "RemoveContainer" containerID="174cf0258fb93c52c964e89ccb7ea77d9cb0e4fb77f37a49b81be365390d1f67"
Mar 20 11:36:38 crc kubenswrapper[4860]: E0320 11:36:38.414096 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080"
Mar 20 11:36:48 crc kubenswrapper[4860]: I0320 11:36:48.898309 4860 scope.go:117] "RemoveContainer" containerID="6c25fc89bb9294757bf7b8ce97118d32231c76b11a8724a70385e83cc510600a"
Mar 20 11:36:49 crc kubenswrapper[4860]: I0320 11:36:49.413870 4860 scope.go:117] "RemoveContainer" containerID="174cf0258fb93c52c964e89ccb7ea77d9cb0e4fb77f37a49b81be365390d1f67"
Mar 20 11:36:49 crc kubenswrapper[4860]: E0320 11:36:49.414488 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080"
Mar 20 11:37:02 crc kubenswrapper[4860]: I0320 11:37:02.414037 4860 scope.go:117] "RemoveContainer" containerID="174cf0258fb93c52c964e89ccb7ea77d9cb0e4fb77f37a49b81be365390d1f67"
Mar 20 11:37:02 crc kubenswrapper[4860]: E0320 11:37:02.414907 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080"
Mar 20 11:37:17 crc kubenswrapper[4860]: I0320 11:37:17.419085 4860 scope.go:117] "RemoveContainer" containerID="174cf0258fb93c52c964e89ccb7ea77d9cb0e4fb77f37a49b81be365390d1f67"
Mar 20 11:37:17 crc kubenswrapper[4860]: E0320 11:37:17.421972 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080"
Mar 20 11:37:28 crc kubenswrapper[4860]: I0320 11:37:28.414135 4860 scope.go:117] "RemoveContainer" containerID="174cf0258fb93c52c964e89ccb7ea77d9cb0e4fb77f37a49b81be365390d1f67"
Mar 20 11:37:28 crc kubenswrapper[4860]: E0320 11:37:28.416856 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080"
Mar 20 11:37:39 crc kubenswrapper[4860]: I0320 11:37:39.414026 4860 scope.go:117] "RemoveContainer" containerID="174cf0258fb93c52c964e89ccb7ea77d9cb0e4fb77f37a49b81be365390d1f67"
Mar 20 11:37:39 crc kubenswrapper[4860]: E0320 11:37:39.415243 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080"
Mar 20 11:37:52 crc kubenswrapper[4860]: I0320 11:37:52.413689 4860 scope.go:117] "RemoveContainer" containerID="174cf0258fb93c52c964e89ccb7ea77d9cb0e4fb77f37a49b81be365390d1f67"
Mar 20 11:37:52 crc kubenswrapper[4860]: I0320 11:37:52.895623 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" event={"ID":"6a9df230-75a1-4b64-8d00-c179e9c19080","Type":"ContainerStarted","Data":"855d8453e4234a7aa607d0c491023b12563d04004d3513d67a5fbedfc07f707d"}
Mar 20 11:38:00 crc kubenswrapper[4860]: I0320 11:38:00.152624 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566778-dqv4l"]
Mar 20 11:38:00 crc kubenswrapper[4860]: E0320 11:38:00.157416 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd9943d6-a840-4ba5-b12c-9ebf3cbd1224" containerName="oc"
Mar 20 11:38:00 crc kubenswrapper[4860]: I0320 11:38:00.157743 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd9943d6-a840-4ba5-b12c-9ebf3cbd1224" containerName="oc"
Mar 20 11:38:00 crc kubenswrapper[4860]: I0320 11:38:00.158122 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd9943d6-a840-4ba5-b12c-9ebf3cbd1224" containerName="oc"
Mar 20 11:38:00 crc kubenswrapper[4860]: I0320 11:38:00.158993 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566778-dqv4l"
Mar 20 11:38:00 crc kubenswrapper[4860]: I0320 11:38:00.162188 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 11:38:00 crc kubenswrapper[4860]: I0320 11:38:00.162290 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-82p2f"
Mar 20 11:38:00 crc kubenswrapper[4860]: I0320 11:38:00.163747 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 11:38:00 crc kubenswrapper[4860]: I0320 11:38:00.164671 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566778-dqv4l"]
Mar 20 11:38:00 crc kubenswrapper[4860]: I0320 11:38:00.173141 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ck5pk\" (UniqueName: \"kubernetes.io/projected/04737161-e8a4-4231-8b96-1a617b9561a7-kube-api-access-ck5pk\") pod \"auto-csr-approver-29566778-dqv4l\" (UID: \"04737161-e8a4-4231-8b96-1a617b9561a7\") " pod="openshift-infra/auto-csr-approver-29566778-dqv4l"
Mar 20 11:38:00 crc kubenswrapper[4860]: I0320 11:38:00.274044 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ck5pk\" (UniqueName: \"kubernetes.io/projected/04737161-e8a4-4231-8b96-1a617b9561a7-kube-api-access-ck5pk\") pod \"auto-csr-approver-29566778-dqv4l\" (UID: \"04737161-e8a4-4231-8b96-1a617b9561a7\") " pod="openshift-infra/auto-csr-approver-29566778-dqv4l"
Mar 20 11:38:00 crc kubenswrapper[4860]: I0320 11:38:00.296929 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ck5pk\" (UniqueName: \"kubernetes.io/projected/04737161-e8a4-4231-8b96-1a617b9561a7-kube-api-access-ck5pk\") pod \"auto-csr-approver-29566778-dqv4l\" (UID: \"04737161-e8a4-4231-8b96-1a617b9561a7\") " pod="openshift-infra/auto-csr-approver-29566778-dqv4l"
Mar 20 11:38:00 crc kubenswrapper[4860]: I0320 11:38:00.482602 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566778-dqv4l"
Mar 20 11:38:00 crc kubenswrapper[4860]: I0320 11:38:00.953908 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566778-dqv4l"]
Mar 20 11:38:01 crc kubenswrapper[4860]: I0320 11:38:01.974669 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566778-dqv4l" event={"ID":"04737161-e8a4-4231-8b96-1a617b9561a7","Type":"ContainerStarted","Data":"829048024aa9afa867e48a01346be2971ff40c547363a521573193a78b169151"}
Mar 20 11:38:02 crc kubenswrapper[4860]: I0320 11:38:02.985506 4860 generic.go:334] "Generic (PLEG): container finished" podID="04737161-e8a4-4231-8b96-1a617b9561a7" containerID="2bf13e1cbb626df84de24a90ddc00424f7dbac653c634127ff56b49722ddadfd" exitCode=0
Mar 20 11:38:02 crc kubenswrapper[4860]: I0320 11:38:02.985590 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566778-dqv4l" event={"ID":"04737161-e8a4-4231-8b96-1a617b9561a7","Type":"ContainerDied","Data":"2bf13e1cbb626df84de24a90ddc00424f7dbac653c634127ff56b49722ddadfd"}
Mar 20 11:38:04 crc kubenswrapper[4860]: I0320 11:38:04.309731 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566778-dqv4l"
Mar 20 11:38:04 crc kubenswrapper[4860]: I0320 11:38:04.443849 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ck5pk\" (UniqueName: \"kubernetes.io/projected/04737161-e8a4-4231-8b96-1a617b9561a7-kube-api-access-ck5pk\") pod \"04737161-e8a4-4231-8b96-1a617b9561a7\" (UID: \"04737161-e8a4-4231-8b96-1a617b9561a7\") "
Mar 20 11:38:04 crc kubenswrapper[4860]: I0320 11:38:04.451002 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04737161-e8a4-4231-8b96-1a617b9561a7-kube-api-access-ck5pk" (OuterVolumeSpecName: "kube-api-access-ck5pk") pod "04737161-e8a4-4231-8b96-1a617b9561a7" (UID: "04737161-e8a4-4231-8b96-1a617b9561a7"). InnerVolumeSpecName "kube-api-access-ck5pk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 11:38:04 crc kubenswrapper[4860]: I0320 11:38:04.546478 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ck5pk\" (UniqueName: \"kubernetes.io/projected/04737161-e8a4-4231-8b96-1a617b9561a7-kube-api-access-ck5pk\") on node \"crc\" DevicePath \"\""
Mar 20 11:38:05 crc kubenswrapper[4860]: I0320 11:38:05.023630 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566778-dqv4l" event={"ID":"04737161-e8a4-4231-8b96-1a617b9561a7","Type":"ContainerDied","Data":"829048024aa9afa867e48a01346be2971ff40c547363a521573193a78b169151"}
Mar 20 11:38:05 crc kubenswrapper[4860]: I0320 11:38:05.023690 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566778-dqv4l"
Mar 20 11:38:05 crc kubenswrapper[4860]: I0320 11:38:05.023701 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="829048024aa9afa867e48a01346be2971ff40c547363a521573193a78b169151"
Mar 20 11:38:05 crc kubenswrapper[4860]: I0320 11:38:05.388552 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566772-bq77p"]
Mar 20 11:38:05 crc kubenswrapper[4860]: I0320 11:38:05.395263 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566772-bq77p"]
Mar 20 11:38:05 crc kubenswrapper[4860]: I0320 11:38:05.422850 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e22a4be9-9edf-4029-b504-f5c059318959" path="/var/lib/kubelet/pods/e22a4be9-9edf-4029-b504-f5c059318959/volumes"
Mar 20 11:38:49 crc kubenswrapper[4860]: I0320 11:38:49.021407 4860 scope.go:117] "RemoveContainer" containerID="954a495195a9f9d931051a4c1f1eba69bbfb896f6fe4601ca7ac4a6c57e030ea"
Mar 20 11:39:52 crc kubenswrapper[4860]: I0320 11:39:52.344737 4860 patch_prober.go:28] interesting pod/machine-config-daemon-kvdqp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 11:39:52 crc kubenswrapper[4860]: I0320 11:39:52.345837 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 11:40:00 crc kubenswrapper[4860]: I0320 11:40:00.154512 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566780-dpx82"]
Mar 20 11:40:00 crc kubenswrapper[4860]: E0320 11:40:00.156498 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04737161-e8a4-4231-8b96-1a617b9561a7" containerName="oc"
Mar 20 11:40:00 crc kubenswrapper[4860]: I0320 11:40:00.156521 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="04737161-e8a4-4231-8b96-1a617b9561a7" containerName="oc"
Mar 20 11:40:00 crc kubenswrapper[4860]: I0320 11:40:00.156747 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="04737161-e8a4-4231-8b96-1a617b9561a7" containerName="oc"
Mar 20 11:40:00 crc kubenswrapper[4860]: I0320 11:40:00.157566 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566780-dpx82"
Mar 20 11:40:00 crc kubenswrapper[4860]: I0320 11:40:00.159860 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-82p2f"
Mar 20 11:40:00 crc kubenswrapper[4860]: I0320 11:40:00.160788 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 11:40:00 crc kubenswrapper[4860]: I0320 11:40:00.160976 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 11:40:00 crc kubenswrapper[4860]: I0320 11:40:00.178694 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566780-dpx82"]
Mar 20 11:40:00 crc kubenswrapper[4860]: I0320 11:40:00.263055 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8dwx\" (UniqueName: \"kubernetes.io/projected/0f75e65d-773b-4474-985a-2ca6fea0dc6a-kube-api-access-c8dwx\") pod \"auto-csr-approver-29566780-dpx82\" (UID: \"0f75e65d-773b-4474-985a-2ca6fea0dc6a\") " pod="openshift-infra/auto-csr-approver-29566780-dpx82"
Mar 20 11:40:00 crc kubenswrapper[4860]: I0320 11:40:00.364918 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8dwx\" (UniqueName: \"kubernetes.io/projected/0f75e65d-773b-4474-985a-2ca6fea0dc6a-kube-api-access-c8dwx\") pod \"auto-csr-approver-29566780-dpx82\" (UID: \"0f75e65d-773b-4474-985a-2ca6fea0dc6a\") " pod="openshift-infra/auto-csr-approver-29566780-dpx82"
Mar 20 11:40:00 crc kubenswrapper[4860]: I0320 11:40:00.388819 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8dwx\" (UniqueName: \"kubernetes.io/projected/0f75e65d-773b-4474-985a-2ca6fea0dc6a-kube-api-access-c8dwx\") pod \"auto-csr-approver-29566780-dpx82\" (UID: \"0f75e65d-773b-4474-985a-2ca6fea0dc6a\") " pod="openshift-infra/auto-csr-approver-29566780-dpx82"
Mar 20 11:40:00 crc kubenswrapper[4860]: I0320 11:40:00.485010 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566780-dpx82"
Mar 20 11:40:00 crc kubenswrapper[4860]: I0320 11:40:00.942121 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566780-dpx82"]
Mar 20 11:40:00 crc kubenswrapper[4860]: I0320 11:40:00.954031 4860 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 20 11:40:01 crc kubenswrapper[4860]: I0320 11:40:01.957522 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566780-dpx82" event={"ID":"0f75e65d-773b-4474-985a-2ca6fea0dc6a","Type":"ContainerStarted","Data":"2d388af5cf98d85e1bc39a3ca3083e9b0d64a28e6ef17cf1f4fb688fd499b7e3"}
Mar 20 11:40:02 crc kubenswrapper[4860]: I0320 11:40:02.969147 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566780-dpx82" event={"ID":"0f75e65d-773b-4474-985a-2ca6fea0dc6a","Type":"ContainerStarted","Data":"7535e8513557739754c0e4889cb6b1e5e63acd730d71679b5f7aab2ccfb3bbbd"}
Mar 20 11:40:03 crc kubenswrapper[4860]: I0320 11:40:03.978427 4860 generic.go:334] "Generic (PLEG): container finished" podID="0f75e65d-773b-4474-985a-2ca6fea0dc6a" containerID="7535e8513557739754c0e4889cb6b1e5e63acd730d71679b5f7aab2ccfb3bbbd" exitCode=0
Mar 20 11:40:03 crc kubenswrapper[4860]: I0320 11:40:03.978493 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566780-dpx82" event={"ID":"0f75e65d-773b-4474-985a-2ca6fea0dc6a","Type":"ContainerDied","Data":"7535e8513557739754c0e4889cb6b1e5e63acd730d71679b5f7aab2ccfb3bbbd"}
Mar 20 11:40:04 crc kubenswrapper[4860]: I0320 11:40:04.274937 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566780-dpx82"
Mar 20 11:40:04 crc kubenswrapper[4860]: I0320 11:40:04.446979 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8dwx\" (UniqueName: \"kubernetes.io/projected/0f75e65d-773b-4474-985a-2ca6fea0dc6a-kube-api-access-c8dwx\") pod \"0f75e65d-773b-4474-985a-2ca6fea0dc6a\" (UID: \"0f75e65d-773b-4474-985a-2ca6fea0dc6a\") "
Mar 20 11:40:04 crc kubenswrapper[4860]: I0320 11:40:04.455051 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f75e65d-773b-4474-985a-2ca6fea0dc6a-kube-api-access-c8dwx" (OuterVolumeSpecName: "kube-api-access-c8dwx") pod "0f75e65d-773b-4474-985a-2ca6fea0dc6a" (UID: "0f75e65d-773b-4474-985a-2ca6fea0dc6a"). InnerVolumeSpecName "kube-api-access-c8dwx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 11:40:04 crc kubenswrapper[4860]: I0320 11:40:04.549078 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c8dwx\" (UniqueName: \"kubernetes.io/projected/0f75e65d-773b-4474-985a-2ca6fea0dc6a-kube-api-access-c8dwx\") on node \"crc\" DevicePath \"\""
Mar 20 11:40:04 crc kubenswrapper[4860]: I0320 11:40:04.988608 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566780-dpx82" event={"ID":"0f75e65d-773b-4474-985a-2ca6fea0dc6a","Type":"ContainerDied","Data":"2d388af5cf98d85e1bc39a3ca3083e9b0d64a28e6ef17cf1f4fb688fd499b7e3"}
Mar 20 11:40:04 crc kubenswrapper[4860]: I0320 11:40:04.988648 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566780-dpx82"
Mar 20 11:40:04 crc kubenswrapper[4860]: I0320 11:40:04.988670 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d388af5cf98d85e1bc39a3ca3083e9b0d64a28e6ef17cf1f4fb688fd499b7e3"
Mar 20 11:40:05 crc kubenswrapper[4860]: I0320 11:40:05.359930 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566774-jmb8l"]
Mar 20 11:40:05 crc kubenswrapper[4860]: I0320 11:40:05.366657 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566774-jmb8l"]
Mar 20 11:40:05 crc kubenswrapper[4860]: I0320 11:40:05.422436 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34154cfb-cf91-40a0-8390-78823b11b698" path="/var/lib/kubelet/pods/34154cfb-cf91-40a0-8390-78823b11b698/volumes"
Mar 20 11:40:22 crc kubenswrapper[4860]: I0320 11:40:22.343795 4860 patch_prober.go:28] interesting pod/machine-config-daemon-kvdqp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 11:40:22 crc kubenswrapper[4860]: I0320 11:40:22.344708 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 11:40:49 crc kubenswrapper[4860]: I0320 11:40:49.111338 4860 scope.go:117] "RemoveContainer" containerID="bd1a330fd37e3c9d04a847aabc8c53a75648f5f571a68d378c964de8d51bbab7"
Mar 20 11:40:52 crc kubenswrapper[4860]: I0320 11:40:52.343895 4860 patch_prober.go:28] interesting pod/machine-config-daemon-kvdqp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 11:40:52 crc kubenswrapper[4860]: I0320 11:40:52.344507 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 11:40:52 crc kubenswrapper[4860]: I0320 11:40:52.344570 4860 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp"
Mar 20 11:40:52 crc kubenswrapper[4860]: I0320 11:40:52.345482 4860 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"855d8453e4234a7aa607d0c491023b12563d04004d3513d67a5fbedfc07f707d"} pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar
20 11:40:52 crc kubenswrapper[4860]: I0320 11:40:52.345542 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" containerID="cri-o://855d8453e4234a7aa607d0c491023b12563d04004d3513d67a5fbedfc07f707d" gracePeriod=600 Mar 20 11:40:53 crc kubenswrapper[4860]: I0320 11:40:53.393057 4860 generic.go:334] "Generic (PLEG): container finished" podID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerID="855d8453e4234a7aa607d0c491023b12563d04004d3513d67a5fbedfc07f707d" exitCode=0 Mar 20 11:40:53 crc kubenswrapper[4860]: I0320 11:40:53.393133 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" event={"ID":"6a9df230-75a1-4b64-8d00-c179e9c19080","Type":"ContainerDied","Data":"855d8453e4234a7aa607d0c491023b12563d04004d3513d67a5fbedfc07f707d"} Mar 20 11:40:53 crc kubenswrapper[4860]: I0320 11:40:53.393898 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" event={"ID":"6a9df230-75a1-4b64-8d00-c179e9c19080","Type":"ContainerStarted","Data":"2c6cd87f8a77136a1839b5f761bdf9d0b06d37fb07e23d5035e6ffed2dcf296d"} Mar 20 11:40:53 crc kubenswrapper[4860]: I0320 11:40:53.393945 4860 scope.go:117] "RemoveContainer" containerID="174cf0258fb93c52c964e89ccb7ea77d9cb0e4fb77f37a49b81be365390d1f67" Mar 20 11:41:34 crc kubenswrapper[4860]: I0320 11:41:34.621694 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7pbqv"] Mar 20 11:41:34 crc kubenswrapper[4860]: E0320 11:41:34.622974 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f75e65d-773b-4474-985a-2ca6fea0dc6a" containerName="oc" Mar 20 11:41:34 crc kubenswrapper[4860]: I0320 11:41:34.622990 4860 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0f75e65d-773b-4474-985a-2ca6fea0dc6a" containerName="oc" Mar 20 11:41:34 crc kubenswrapper[4860]: I0320 11:41:34.623157 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f75e65d-773b-4474-985a-2ca6fea0dc6a" containerName="oc" Mar 20 11:41:34 crc kubenswrapper[4860]: I0320 11:41:34.624609 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7pbqv" Mar 20 11:41:34 crc kubenswrapper[4860]: I0320 11:41:34.636631 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7pbqv"] Mar 20 11:41:34 crc kubenswrapper[4860]: I0320 11:41:34.790862 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2af21af-50b4-4c92-9d39-40c326084305-utilities\") pod \"community-operators-7pbqv\" (UID: \"c2af21af-50b4-4c92-9d39-40c326084305\") " pod="openshift-marketplace/community-operators-7pbqv" Mar 20 11:41:34 crc kubenswrapper[4860]: I0320 11:41:34.791520 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2af21af-50b4-4c92-9d39-40c326084305-catalog-content\") pod \"community-operators-7pbqv\" (UID: \"c2af21af-50b4-4c92-9d39-40c326084305\") " pod="openshift-marketplace/community-operators-7pbqv" Mar 20 11:41:34 crc kubenswrapper[4860]: I0320 11:41:34.791582 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrf7f\" (UniqueName: \"kubernetes.io/projected/c2af21af-50b4-4c92-9d39-40c326084305-kube-api-access-hrf7f\") pod \"community-operators-7pbqv\" (UID: \"c2af21af-50b4-4c92-9d39-40c326084305\") " pod="openshift-marketplace/community-operators-7pbqv" Mar 20 11:41:34 crc kubenswrapper[4860]: I0320 11:41:34.893249 4860 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2af21af-50b4-4c92-9d39-40c326084305-utilities\") pod \"community-operators-7pbqv\" (UID: \"c2af21af-50b4-4c92-9d39-40c326084305\") " pod="openshift-marketplace/community-operators-7pbqv" Mar 20 11:41:34 crc kubenswrapper[4860]: I0320 11:41:34.893330 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2af21af-50b4-4c92-9d39-40c326084305-catalog-content\") pod \"community-operators-7pbqv\" (UID: \"c2af21af-50b4-4c92-9d39-40c326084305\") " pod="openshift-marketplace/community-operators-7pbqv" Mar 20 11:41:34 crc kubenswrapper[4860]: I0320 11:41:34.893405 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrf7f\" (UniqueName: \"kubernetes.io/projected/c2af21af-50b4-4c92-9d39-40c326084305-kube-api-access-hrf7f\") pod \"community-operators-7pbqv\" (UID: \"c2af21af-50b4-4c92-9d39-40c326084305\") " pod="openshift-marketplace/community-operators-7pbqv" Mar 20 11:41:34 crc kubenswrapper[4860]: I0320 11:41:34.893980 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2af21af-50b4-4c92-9d39-40c326084305-utilities\") pod \"community-operators-7pbqv\" (UID: \"c2af21af-50b4-4c92-9d39-40c326084305\") " pod="openshift-marketplace/community-operators-7pbqv" Mar 20 11:41:34 crc kubenswrapper[4860]: I0320 11:41:34.894184 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2af21af-50b4-4c92-9d39-40c326084305-catalog-content\") pod \"community-operators-7pbqv\" (UID: \"c2af21af-50b4-4c92-9d39-40c326084305\") " pod="openshift-marketplace/community-operators-7pbqv" Mar 20 11:41:34 crc kubenswrapper[4860]: I0320 11:41:34.921748 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrf7f\" 
(UniqueName: \"kubernetes.io/projected/c2af21af-50b4-4c92-9d39-40c326084305-kube-api-access-hrf7f\") pod \"community-operators-7pbqv\" (UID: \"c2af21af-50b4-4c92-9d39-40c326084305\") " pod="openshift-marketplace/community-operators-7pbqv" Mar 20 11:41:34 crc kubenswrapper[4860]: I0320 11:41:34.948252 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7pbqv" Mar 20 11:41:35 crc kubenswrapper[4860]: I0320 11:41:35.525383 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7pbqv"] Mar 20 11:41:35 crc kubenswrapper[4860]: I0320 11:41:35.718506 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7pbqv" event={"ID":"c2af21af-50b4-4c92-9d39-40c326084305","Type":"ContainerStarted","Data":"e5ac5ddd5c069f37405eb54fe0ce2a92994ce39706f8b4d4aa25212406b4a77b"} Mar 20 11:41:36 crc kubenswrapper[4860]: I0320 11:41:36.731091 4860 generic.go:334] "Generic (PLEG): container finished" podID="c2af21af-50b4-4c92-9d39-40c326084305" containerID="e56de3eee5e35cceaccf8c3589499f33bf561e49b0aa363dbe5ed01f4a6b7179" exitCode=0 Mar 20 11:41:36 crc kubenswrapper[4860]: I0320 11:41:36.731175 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7pbqv" event={"ID":"c2af21af-50b4-4c92-9d39-40c326084305","Type":"ContainerDied","Data":"e56de3eee5e35cceaccf8c3589499f33bf561e49b0aa363dbe5ed01f4a6b7179"} Mar 20 11:41:39 crc kubenswrapper[4860]: I0320 11:41:39.757512 4860 generic.go:334] "Generic (PLEG): container finished" podID="c2af21af-50b4-4c92-9d39-40c326084305" containerID="75b271fbe5c04e26afe7485152c4661d6f947a3f32fdbb61994bf363b9f198d8" exitCode=0 Mar 20 11:41:39 crc kubenswrapper[4860]: I0320 11:41:39.757589 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7pbqv" 
event={"ID":"c2af21af-50b4-4c92-9d39-40c326084305","Type":"ContainerDied","Data":"75b271fbe5c04e26afe7485152c4661d6f947a3f32fdbb61994bf363b9f198d8"} Mar 20 11:41:42 crc kubenswrapper[4860]: I0320 11:41:42.801699 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7pbqv" event={"ID":"c2af21af-50b4-4c92-9d39-40c326084305","Type":"ContainerStarted","Data":"1d10f0d6586195959bd87e5f3f283c5d0718b91b40a5ef2ac461e5f70cc27bb0"} Mar 20 11:41:42 crc kubenswrapper[4860]: I0320 11:41:42.825788 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7pbqv" podStartSLOduration=3.512832007 podStartE2EDuration="8.825764183s" podCreationTimestamp="2026-03-20 11:41:34 +0000 UTC" firstStartedPulling="2026-03-20 11:41:36.734061346 +0000 UTC m=+2820.955422244" lastFinishedPulling="2026-03-20 11:41:42.046993522 +0000 UTC m=+2826.268354420" observedRunningTime="2026-03-20 11:41:42.825742172 +0000 UTC m=+2827.047103080" watchObservedRunningTime="2026-03-20 11:41:42.825764183 +0000 UTC m=+2827.047125081" Mar 20 11:41:44 crc kubenswrapper[4860]: I0320 11:41:44.948698 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7pbqv" Mar 20 11:41:44 crc kubenswrapper[4860]: I0320 11:41:44.949159 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7pbqv" Mar 20 11:41:45 crc kubenswrapper[4860]: I0320 11:41:45.001740 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7pbqv" Mar 20 11:41:54 crc kubenswrapper[4860]: I0320 11:41:54.999027 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7pbqv" Mar 20 11:41:55 crc kubenswrapper[4860]: I0320 11:41:55.060249 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-7pbqv"] Mar 20 11:41:55 crc kubenswrapper[4860]: I0320 11:41:55.900010 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7pbqv" podUID="c2af21af-50b4-4c92-9d39-40c326084305" containerName="registry-server" containerID="cri-o://1d10f0d6586195959bd87e5f3f283c5d0718b91b40a5ef2ac461e5f70cc27bb0" gracePeriod=2 Mar 20 11:41:56 crc kubenswrapper[4860]: I0320 11:41:56.348702 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7pbqv" Mar 20 11:41:56 crc kubenswrapper[4860]: I0320 11:41:56.482801 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hrf7f\" (UniqueName: \"kubernetes.io/projected/c2af21af-50b4-4c92-9d39-40c326084305-kube-api-access-hrf7f\") pod \"c2af21af-50b4-4c92-9d39-40c326084305\" (UID: \"c2af21af-50b4-4c92-9d39-40c326084305\") " Mar 20 11:41:56 crc kubenswrapper[4860]: I0320 11:41:56.482959 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2af21af-50b4-4c92-9d39-40c326084305-catalog-content\") pod \"c2af21af-50b4-4c92-9d39-40c326084305\" (UID: \"c2af21af-50b4-4c92-9d39-40c326084305\") " Mar 20 11:41:56 crc kubenswrapper[4860]: I0320 11:41:56.491448 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2af21af-50b4-4c92-9d39-40c326084305-utilities\") pod \"c2af21af-50b4-4c92-9d39-40c326084305\" (UID: \"c2af21af-50b4-4c92-9d39-40c326084305\") " Mar 20 11:41:56 crc kubenswrapper[4860]: I0320 11:41:56.492453 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2af21af-50b4-4c92-9d39-40c326084305-utilities" (OuterVolumeSpecName: "utilities") pod "c2af21af-50b4-4c92-9d39-40c326084305" (UID: 
"c2af21af-50b4-4c92-9d39-40c326084305"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:41:56 crc kubenswrapper[4860]: I0320 11:41:56.495498 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2af21af-50b4-4c92-9d39-40c326084305-kube-api-access-hrf7f" (OuterVolumeSpecName: "kube-api-access-hrf7f") pod "c2af21af-50b4-4c92-9d39-40c326084305" (UID: "c2af21af-50b4-4c92-9d39-40c326084305"). InnerVolumeSpecName "kube-api-access-hrf7f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:41:56 crc kubenswrapper[4860]: I0320 11:41:56.540915 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2af21af-50b4-4c92-9d39-40c326084305-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c2af21af-50b4-4c92-9d39-40c326084305" (UID: "c2af21af-50b4-4c92-9d39-40c326084305"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:41:56 crc kubenswrapper[4860]: I0320 11:41:56.601541 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hrf7f\" (UniqueName: \"kubernetes.io/projected/c2af21af-50b4-4c92-9d39-40c326084305-kube-api-access-hrf7f\") on node \"crc\" DevicePath \"\"" Mar 20 11:41:56 crc kubenswrapper[4860]: I0320 11:41:56.601577 4860 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2af21af-50b4-4c92-9d39-40c326084305-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:41:56 crc kubenswrapper[4860]: I0320 11:41:56.601589 4860 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2af21af-50b4-4c92-9d39-40c326084305-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:41:56 crc kubenswrapper[4860]: I0320 11:41:56.910257 4860 generic.go:334] "Generic (PLEG): container finished" 
podID="c2af21af-50b4-4c92-9d39-40c326084305" containerID="1d10f0d6586195959bd87e5f3f283c5d0718b91b40a5ef2ac461e5f70cc27bb0" exitCode=0 Mar 20 11:41:56 crc kubenswrapper[4860]: I0320 11:41:56.910312 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7pbqv" event={"ID":"c2af21af-50b4-4c92-9d39-40c326084305","Type":"ContainerDied","Data":"1d10f0d6586195959bd87e5f3f283c5d0718b91b40a5ef2ac461e5f70cc27bb0"} Mar 20 11:41:56 crc kubenswrapper[4860]: I0320 11:41:56.910355 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7pbqv" event={"ID":"c2af21af-50b4-4c92-9d39-40c326084305","Type":"ContainerDied","Data":"e5ac5ddd5c069f37405eb54fe0ce2a92994ce39706f8b4d4aa25212406b4a77b"} Mar 20 11:41:56 crc kubenswrapper[4860]: I0320 11:41:56.910375 4860 scope.go:117] "RemoveContainer" containerID="1d10f0d6586195959bd87e5f3f283c5d0718b91b40a5ef2ac461e5f70cc27bb0" Mar 20 11:41:56 crc kubenswrapper[4860]: I0320 11:41:56.910379 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7pbqv" Mar 20 11:41:56 crc kubenswrapper[4860]: I0320 11:41:56.935770 4860 scope.go:117] "RemoveContainer" containerID="75b271fbe5c04e26afe7485152c4661d6f947a3f32fdbb61994bf363b9f198d8" Mar 20 11:41:56 crc kubenswrapper[4860]: I0320 11:41:56.950886 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7pbqv"] Mar 20 11:41:56 crc kubenswrapper[4860]: I0320 11:41:56.958448 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7pbqv"] Mar 20 11:41:56 crc kubenswrapper[4860]: I0320 11:41:56.961116 4860 scope.go:117] "RemoveContainer" containerID="e56de3eee5e35cceaccf8c3589499f33bf561e49b0aa363dbe5ed01f4a6b7179" Mar 20 11:41:56 crc kubenswrapper[4860]: I0320 11:41:56.988307 4860 scope.go:117] "RemoveContainer" containerID="1d10f0d6586195959bd87e5f3f283c5d0718b91b40a5ef2ac461e5f70cc27bb0" Mar 20 11:41:56 crc kubenswrapper[4860]: E0320 11:41:56.988993 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d10f0d6586195959bd87e5f3f283c5d0718b91b40a5ef2ac461e5f70cc27bb0\": container with ID starting with 1d10f0d6586195959bd87e5f3f283c5d0718b91b40a5ef2ac461e5f70cc27bb0 not found: ID does not exist" containerID="1d10f0d6586195959bd87e5f3f283c5d0718b91b40a5ef2ac461e5f70cc27bb0" Mar 20 11:41:56 crc kubenswrapper[4860]: I0320 11:41:56.989077 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d10f0d6586195959bd87e5f3f283c5d0718b91b40a5ef2ac461e5f70cc27bb0"} err="failed to get container status \"1d10f0d6586195959bd87e5f3f283c5d0718b91b40a5ef2ac461e5f70cc27bb0\": rpc error: code = NotFound desc = could not find container \"1d10f0d6586195959bd87e5f3f283c5d0718b91b40a5ef2ac461e5f70cc27bb0\": container with ID starting with 1d10f0d6586195959bd87e5f3f283c5d0718b91b40a5ef2ac461e5f70cc27bb0 not 
found: ID does not exist" Mar 20 11:41:56 crc kubenswrapper[4860]: I0320 11:41:56.989129 4860 scope.go:117] "RemoveContainer" containerID="75b271fbe5c04e26afe7485152c4661d6f947a3f32fdbb61994bf363b9f198d8" Mar 20 11:41:56 crc kubenswrapper[4860]: E0320 11:41:56.989540 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75b271fbe5c04e26afe7485152c4661d6f947a3f32fdbb61994bf363b9f198d8\": container with ID starting with 75b271fbe5c04e26afe7485152c4661d6f947a3f32fdbb61994bf363b9f198d8 not found: ID does not exist" containerID="75b271fbe5c04e26afe7485152c4661d6f947a3f32fdbb61994bf363b9f198d8" Mar 20 11:41:56 crc kubenswrapper[4860]: I0320 11:41:56.989588 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75b271fbe5c04e26afe7485152c4661d6f947a3f32fdbb61994bf363b9f198d8"} err="failed to get container status \"75b271fbe5c04e26afe7485152c4661d6f947a3f32fdbb61994bf363b9f198d8\": rpc error: code = NotFound desc = could not find container \"75b271fbe5c04e26afe7485152c4661d6f947a3f32fdbb61994bf363b9f198d8\": container with ID starting with 75b271fbe5c04e26afe7485152c4661d6f947a3f32fdbb61994bf363b9f198d8 not found: ID does not exist" Mar 20 11:41:56 crc kubenswrapper[4860]: I0320 11:41:56.989625 4860 scope.go:117] "RemoveContainer" containerID="e56de3eee5e35cceaccf8c3589499f33bf561e49b0aa363dbe5ed01f4a6b7179" Mar 20 11:41:56 crc kubenswrapper[4860]: E0320 11:41:56.989992 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e56de3eee5e35cceaccf8c3589499f33bf561e49b0aa363dbe5ed01f4a6b7179\": container with ID starting with e56de3eee5e35cceaccf8c3589499f33bf561e49b0aa363dbe5ed01f4a6b7179 not found: ID does not exist" containerID="e56de3eee5e35cceaccf8c3589499f33bf561e49b0aa363dbe5ed01f4a6b7179" Mar 20 11:41:56 crc kubenswrapper[4860]: I0320 11:41:56.990029 4860 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e56de3eee5e35cceaccf8c3589499f33bf561e49b0aa363dbe5ed01f4a6b7179"} err="failed to get container status \"e56de3eee5e35cceaccf8c3589499f33bf561e49b0aa363dbe5ed01f4a6b7179\": rpc error: code = NotFound desc = could not find container \"e56de3eee5e35cceaccf8c3589499f33bf561e49b0aa363dbe5ed01f4a6b7179\": container with ID starting with e56de3eee5e35cceaccf8c3589499f33bf561e49b0aa363dbe5ed01f4a6b7179 not found: ID does not exist" Mar 20 11:41:57 crc kubenswrapper[4860]: I0320 11:41:57.424660 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2af21af-50b4-4c92-9d39-40c326084305" path="/var/lib/kubelet/pods/c2af21af-50b4-4c92-9d39-40c326084305/volumes" Mar 20 11:42:00 crc kubenswrapper[4860]: I0320 11:42:00.151507 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566782-lzh8h"] Mar 20 11:42:00 crc kubenswrapper[4860]: E0320 11:42:00.152942 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2af21af-50b4-4c92-9d39-40c326084305" containerName="registry-server" Mar 20 11:42:00 crc kubenswrapper[4860]: I0320 11:42:00.152966 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2af21af-50b4-4c92-9d39-40c326084305" containerName="registry-server" Mar 20 11:42:00 crc kubenswrapper[4860]: E0320 11:42:00.152983 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2af21af-50b4-4c92-9d39-40c326084305" containerName="extract-utilities" Mar 20 11:42:00 crc kubenswrapper[4860]: I0320 11:42:00.152994 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2af21af-50b4-4c92-9d39-40c326084305" containerName="extract-utilities" Mar 20 11:42:00 crc kubenswrapper[4860]: E0320 11:42:00.153004 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2af21af-50b4-4c92-9d39-40c326084305" containerName="extract-content" Mar 20 11:42:00 crc kubenswrapper[4860]: I0320 
11:42:00.153016 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2af21af-50b4-4c92-9d39-40c326084305" containerName="extract-content" Mar 20 11:42:00 crc kubenswrapper[4860]: I0320 11:42:00.153262 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2af21af-50b4-4c92-9d39-40c326084305" containerName="registry-server" Mar 20 11:42:00 crc kubenswrapper[4860]: I0320 11:42:00.153978 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566782-lzh8h" Mar 20 11:42:00 crc kubenswrapper[4860]: I0320 11:42:00.158134 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:42:00 crc kubenswrapper[4860]: I0320 11:42:00.158287 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-82p2f" Mar 20 11:42:00 crc kubenswrapper[4860]: I0320 11:42:00.158489 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:42:00 crc kubenswrapper[4860]: I0320 11:42:00.168739 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566782-lzh8h"] Mar 20 11:42:00 crc kubenswrapper[4860]: I0320 11:42:00.259692 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpxk7\" (UniqueName: \"kubernetes.io/projected/cfc74c40-1bf8-47fb-91a4-a6e27724dff9-kube-api-access-wpxk7\") pod \"auto-csr-approver-29566782-lzh8h\" (UID: \"cfc74c40-1bf8-47fb-91a4-a6e27724dff9\") " pod="openshift-infra/auto-csr-approver-29566782-lzh8h" Mar 20 11:42:00 crc kubenswrapper[4860]: I0320 11:42:00.361608 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpxk7\" (UniqueName: \"kubernetes.io/projected/cfc74c40-1bf8-47fb-91a4-a6e27724dff9-kube-api-access-wpxk7\") pod \"auto-csr-approver-29566782-lzh8h\" (UID: 
\"cfc74c40-1bf8-47fb-91a4-a6e27724dff9\") " pod="openshift-infra/auto-csr-approver-29566782-lzh8h" Mar 20 11:42:00 crc kubenswrapper[4860]: I0320 11:42:00.387520 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpxk7\" (UniqueName: \"kubernetes.io/projected/cfc74c40-1bf8-47fb-91a4-a6e27724dff9-kube-api-access-wpxk7\") pod \"auto-csr-approver-29566782-lzh8h\" (UID: \"cfc74c40-1bf8-47fb-91a4-a6e27724dff9\") " pod="openshift-infra/auto-csr-approver-29566782-lzh8h" Mar 20 11:42:00 crc kubenswrapper[4860]: I0320 11:42:00.477879 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566782-lzh8h" Mar 20 11:42:00 crc kubenswrapper[4860]: I0320 11:42:00.985237 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566782-lzh8h"] Mar 20 11:42:01 crc kubenswrapper[4860]: I0320 11:42:01.955118 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566782-lzh8h" event={"ID":"cfc74c40-1bf8-47fb-91a4-a6e27724dff9","Type":"ContainerStarted","Data":"98d3d86ac7e10d74407be23825b9d1285c1fb4fb54635183d9c78ae08f65aa5a"} Mar 20 11:42:02 crc kubenswrapper[4860]: I0320 11:42:02.964686 4860 generic.go:334] "Generic (PLEG): container finished" podID="cfc74c40-1bf8-47fb-91a4-a6e27724dff9" containerID="9f2e26a1e8f88c5ac76c8ad0792718b788b15449739711bc2614bdd4cd541855" exitCode=0 Mar 20 11:42:02 crc kubenswrapper[4860]: I0320 11:42:02.964867 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566782-lzh8h" event={"ID":"cfc74c40-1bf8-47fb-91a4-a6e27724dff9","Type":"ContainerDied","Data":"9f2e26a1e8f88c5ac76c8ad0792718b788b15449739711bc2614bdd4cd541855"} Mar 20 11:42:04 crc kubenswrapper[4860]: I0320 11:42:04.266264 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566782-lzh8h" Mar 20 11:42:04 crc kubenswrapper[4860]: I0320 11:42:04.435620 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpxk7\" (UniqueName: \"kubernetes.io/projected/cfc74c40-1bf8-47fb-91a4-a6e27724dff9-kube-api-access-wpxk7\") pod \"cfc74c40-1bf8-47fb-91a4-a6e27724dff9\" (UID: \"cfc74c40-1bf8-47fb-91a4-a6e27724dff9\") " Mar 20 11:42:04 crc kubenswrapper[4860]: I0320 11:42:04.443998 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfc74c40-1bf8-47fb-91a4-a6e27724dff9-kube-api-access-wpxk7" (OuterVolumeSpecName: "kube-api-access-wpxk7") pod "cfc74c40-1bf8-47fb-91a4-a6e27724dff9" (UID: "cfc74c40-1bf8-47fb-91a4-a6e27724dff9"). InnerVolumeSpecName "kube-api-access-wpxk7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:42:04 crc kubenswrapper[4860]: I0320 11:42:04.537861 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wpxk7\" (UniqueName: \"kubernetes.io/projected/cfc74c40-1bf8-47fb-91a4-a6e27724dff9-kube-api-access-wpxk7\") on node \"crc\" DevicePath \"\"" Mar 20 11:42:04 crc kubenswrapper[4860]: I0320 11:42:04.982317 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566782-lzh8h" event={"ID":"cfc74c40-1bf8-47fb-91a4-a6e27724dff9","Type":"ContainerDied","Data":"98d3d86ac7e10d74407be23825b9d1285c1fb4fb54635183d9c78ae08f65aa5a"} Mar 20 11:42:04 crc kubenswrapper[4860]: I0320 11:42:04.982719 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98d3d86ac7e10d74407be23825b9d1285c1fb4fb54635183d9c78ae08f65aa5a" Mar 20 11:42:04 crc kubenswrapper[4860]: I0320 11:42:04.982382 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566782-lzh8h" Mar 20 11:42:05 crc kubenswrapper[4860]: I0320 11:42:05.358431 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566776-6j7wf"] Mar 20 11:42:05 crc kubenswrapper[4860]: I0320 11:42:05.377148 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566776-6j7wf"] Mar 20 11:42:05 crc kubenswrapper[4860]: I0320 11:42:05.422954 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd9943d6-a840-4ba5-b12c-9ebf3cbd1224" path="/var/lib/kubelet/pods/fd9943d6-a840-4ba5-b12c-9ebf3cbd1224/volumes" Mar 20 11:42:49 crc kubenswrapper[4860]: I0320 11:42:49.214213 4860 scope.go:117] "RemoveContainer" containerID="d289c9e2018c59ba8e3c1ee7b7a591cd4418a9530b3b771d0a13873b671d4bdf" Mar 20 11:42:52 crc kubenswrapper[4860]: I0320 11:42:52.343957 4860 patch_prober.go:28] interesting pod/machine-config-daemon-kvdqp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:42:52 crc kubenswrapper[4860]: I0320 11:42:52.344434 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:43:02 crc kubenswrapper[4860]: I0320 11:43:02.496631 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fzc9q"] Mar 20 11:43:02 crc kubenswrapper[4860]: E0320 11:43:02.498394 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfc74c40-1bf8-47fb-91a4-a6e27724dff9" containerName="oc" Mar 20 11:43:02 crc 
kubenswrapper[4860]: I0320 11:43:02.498410 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfc74c40-1bf8-47fb-91a4-a6e27724dff9" containerName="oc" Mar 20 11:43:02 crc kubenswrapper[4860]: I0320 11:43:02.498582 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfc74c40-1bf8-47fb-91a4-a6e27724dff9" containerName="oc" Mar 20 11:43:02 crc kubenswrapper[4860]: I0320 11:43:02.499736 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fzc9q" Mar 20 11:43:02 crc kubenswrapper[4860]: I0320 11:43:02.513527 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fzc9q"] Mar 20 11:43:02 crc kubenswrapper[4860]: I0320 11:43:02.621301 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83074f2a-d218-47f7-8c37-8b5195f77210-catalog-content\") pod \"redhat-operators-fzc9q\" (UID: \"83074f2a-d218-47f7-8c37-8b5195f77210\") " pod="openshift-marketplace/redhat-operators-fzc9q" Mar 20 11:43:02 crc kubenswrapper[4860]: I0320 11:43:02.621407 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83074f2a-d218-47f7-8c37-8b5195f77210-utilities\") pod \"redhat-operators-fzc9q\" (UID: \"83074f2a-d218-47f7-8c37-8b5195f77210\") " pod="openshift-marketplace/redhat-operators-fzc9q" Mar 20 11:43:02 crc kubenswrapper[4860]: I0320 11:43:02.621531 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwdmj\" (UniqueName: \"kubernetes.io/projected/83074f2a-d218-47f7-8c37-8b5195f77210-kube-api-access-jwdmj\") pod \"redhat-operators-fzc9q\" (UID: \"83074f2a-d218-47f7-8c37-8b5195f77210\") " pod="openshift-marketplace/redhat-operators-fzc9q" Mar 20 11:43:02 crc kubenswrapper[4860]: I0320 
11:43:02.722847 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83074f2a-d218-47f7-8c37-8b5195f77210-catalog-content\") pod \"redhat-operators-fzc9q\" (UID: \"83074f2a-d218-47f7-8c37-8b5195f77210\") " pod="openshift-marketplace/redhat-operators-fzc9q" Mar 20 11:43:02 crc kubenswrapper[4860]: I0320 11:43:02.722922 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83074f2a-d218-47f7-8c37-8b5195f77210-utilities\") pod \"redhat-operators-fzc9q\" (UID: \"83074f2a-d218-47f7-8c37-8b5195f77210\") " pod="openshift-marketplace/redhat-operators-fzc9q" Mar 20 11:43:02 crc kubenswrapper[4860]: I0320 11:43:02.722952 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwdmj\" (UniqueName: \"kubernetes.io/projected/83074f2a-d218-47f7-8c37-8b5195f77210-kube-api-access-jwdmj\") pod \"redhat-operators-fzc9q\" (UID: \"83074f2a-d218-47f7-8c37-8b5195f77210\") " pod="openshift-marketplace/redhat-operators-fzc9q" Mar 20 11:43:02 crc kubenswrapper[4860]: I0320 11:43:02.723524 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83074f2a-d218-47f7-8c37-8b5195f77210-catalog-content\") pod \"redhat-operators-fzc9q\" (UID: \"83074f2a-d218-47f7-8c37-8b5195f77210\") " pod="openshift-marketplace/redhat-operators-fzc9q" Mar 20 11:43:02 crc kubenswrapper[4860]: I0320 11:43:02.723641 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83074f2a-d218-47f7-8c37-8b5195f77210-utilities\") pod \"redhat-operators-fzc9q\" (UID: \"83074f2a-d218-47f7-8c37-8b5195f77210\") " pod="openshift-marketplace/redhat-operators-fzc9q" Mar 20 11:43:02 crc kubenswrapper[4860]: I0320 11:43:02.745884 4860 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-jwdmj\" (UniqueName: \"kubernetes.io/projected/83074f2a-d218-47f7-8c37-8b5195f77210-kube-api-access-jwdmj\") pod \"redhat-operators-fzc9q\" (UID: \"83074f2a-d218-47f7-8c37-8b5195f77210\") " pod="openshift-marketplace/redhat-operators-fzc9q" Mar 20 11:43:02 crc kubenswrapper[4860]: I0320 11:43:02.819775 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fzc9q" Mar 20 11:43:03 crc kubenswrapper[4860]: I0320 11:43:03.160699 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fzc9q"] Mar 20 11:43:03 crc kubenswrapper[4860]: I0320 11:43:03.475409 4860 generic.go:334] "Generic (PLEG): container finished" podID="83074f2a-d218-47f7-8c37-8b5195f77210" containerID="360f2a33729f08c427fc3929d1af0e724d578c412b1459b26ec5648e8ddf3a0d" exitCode=0 Mar 20 11:43:03 crc kubenswrapper[4860]: I0320 11:43:03.475931 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fzc9q" event={"ID":"83074f2a-d218-47f7-8c37-8b5195f77210","Type":"ContainerDied","Data":"360f2a33729f08c427fc3929d1af0e724d578c412b1459b26ec5648e8ddf3a0d"} Mar 20 11:43:03 crc kubenswrapper[4860]: I0320 11:43:03.475969 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fzc9q" event={"ID":"83074f2a-d218-47f7-8c37-8b5195f77210","Type":"ContainerStarted","Data":"8c2d1e411a59750e89077ddc91c2309ab3e7983b7e0eb52dca2b617cef84b7ff"} Mar 20 11:43:05 crc kubenswrapper[4860]: I0320 11:43:05.494069 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fzc9q" event={"ID":"83074f2a-d218-47f7-8c37-8b5195f77210","Type":"ContainerStarted","Data":"f056246561ac73e753fd4dcd7635df4308d4205a262f90e122068c8479290fda"} Mar 20 11:43:06 crc kubenswrapper[4860]: I0320 11:43:06.505159 4860 generic.go:334] "Generic (PLEG): container finished" 
podID="83074f2a-d218-47f7-8c37-8b5195f77210" containerID="f056246561ac73e753fd4dcd7635df4308d4205a262f90e122068c8479290fda" exitCode=0 Mar 20 11:43:06 crc kubenswrapper[4860]: I0320 11:43:06.505255 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fzc9q" event={"ID":"83074f2a-d218-47f7-8c37-8b5195f77210","Type":"ContainerDied","Data":"f056246561ac73e753fd4dcd7635df4308d4205a262f90e122068c8479290fda"} Mar 20 11:43:07 crc kubenswrapper[4860]: I0320 11:43:07.518443 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fzc9q" event={"ID":"83074f2a-d218-47f7-8c37-8b5195f77210","Type":"ContainerStarted","Data":"de6e2e6924a1a544a7dd810a70e0a5780b512d6321f32637c1a4e4c4baf53578"} Mar 20 11:43:07 crc kubenswrapper[4860]: I0320 11:43:07.551688 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fzc9q" podStartSLOduration=1.998661717 podStartE2EDuration="5.551665412s" podCreationTimestamp="2026-03-20 11:43:02 +0000 UTC" firstStartedPulling="2026-03-20 11:43:03.477859127 +0000 UTC m=+2907.699220025" lastFinishedPulling="2026-03-20 11:43:07.030862782 +0000 UTC m=+2911.252223720" observedRunningTime="2026-03-20 11:43:07.545343712 +0000 UTC m=+2911.766704610" watchObservedRunningTime="2026-03-20 11:43:07.551665412 +0000 UTC m=+2911.773026311" Mar 20 11:43:12 crc kubenswrapper[4860]: I0320 11:43:12.820078 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fzc9q" Mar 20 11:43:12 crc kubenswrapper[4860]: I0320 11:43:12.823515 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fzc9q" Mar 20 11:43:13 crc kubenswrapper[4860]: I0320 11:43:13.880105 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fzc9q" podUID="83074f2a-d218-47f7-8c37-8b5195f77210" 
containerName="registry-server" probeResult="failure" output=< Mar 20 11:43:13 crc kubenswrapper[4860]: timeout: failed to connect service ":50051" within 1s Mar 20 11:43:13 crc kubenswrapper[4860]: > Mar 20 11:43:22 crc kubenswrapper[4860]: I0320 11:43:22.344071 4860 patch_prober.go:28] interesting pod/machine-config-daemon-kvdqp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:43:22 crc kubenswrapper[4860]: I0320 11:43:22.344645 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:43:22 crc kubenswrapper[4860]: I0320 11:43:22.870997 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fzc9q" Mar 20 11:43:22 crc kubenswrapper[4860]: I0320 11:43:22.925256 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fzc9q" Mar 20 11:43:23 crc kubenswrapper[4860]: I0320 11:43:23.110490 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fzc9q"] Mar 20 11:43:24 crc kubenswrapper[4860]: I0320 11:43:24.659543 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fzc9q" podUID="83074f2a-d218-47f7-8c37-8b5195f77210" containerName="registry-server" containerID="cri-o://de6e2e6924a1a544a7dd810a70e0a5780b512d6321f32637c1a4e4c4baf53578" gracePeriod=2 Mar 20 11:43:25 crc kubenswrapper[4860]: I0320 11:43:25.085969 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fzc9q" Mar 20 11:43:25 crc kubenswrapper[4860]: I0320 11:43:25.133630 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83074f2a-d218-47f7-8c37-8b5195f77210-catalog-content\") pod \"83074f2a-d218-47f7-8c37-8b5195f77210\" (UID: \"83074f2a-d218-47f7-8c37-8b5195f77210\") " Mar 20 11:43:25 crc kubenswrapper[4860]: I0320 11:43:25.133804 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwdmj\" (UniqueName: \"kubernetes.io/projected/83074f2a-d218-47f7-8c37-8b5195f77210-kube-api-access-jwdmj\") pod \"83074f2a-d218-47f7-8c37-8b5195f77210\" (UID: \"83074f2a-d218-47f7-8c37-8b5195f77210\") " Mar 20 11:43:25 crc kubenswrapper[4860]: I0320 11:43:25.133848 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83074f2a-d218-47f7-8c37-8b5195f77210-utilities\") pod \"83074f2a-d218-47f7-8c37-8b5195f77210\" (UID: \"83074f2a-d218-47f7-8c37-8b5195f77210\") " Mar 20 11:43:25 crc kubenswrapper[4860]: I0320 11:43:25.134991 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83074f2a-d218-47f7-8c37-8b5195f77210-utilities" (OuterVolumeSpecName: "utilities") pod "83074f2a-d218-47f7-8c37-8b5195f77210" (UID: "83074f2a-d218-47f7-8c37-8b5195f77210"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:43:25 crc kubenswrapper[4860]: I0320 11:43:25.143633 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83074f2a-d218-47f7-8c37-8b5195f77210-kube-api-access-jwdmj" (OuterVolumeSpecName: "kube-api-access-jwdmj") pod "83074f2a-d218-47f7-8c37-8b5195f77210" (UID: "83074f2a-d218-47f7-8c37-8b5195f77210"). InnerVolumeSpecName "kube-api-access-jwdmj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:43:25 crc kubenswrapper[4860]: I0320 11:43:25.236780 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwdmj\" (UniqueName: \"kubernetes.io/projected/83074f2a-d218-47f7-8c37-8b5195f77210-kube-api-access-jwdmj\") on node \"crc\" DevicePath \"\"" Mar 20 11:43:25 crc kubenswrapper[4860]: I0320 11:43:25.236833 4860 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83074f2a-d218-47f7-8c37-8b5195f77210-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:43:25 crc kubenswrapper[4860]: I0320 11:43:25.272555 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83074f2a-d218-47f7-8c37-8b5195f77210-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "83074f2a-d218-47f7-8c37-8b5195f77210" (UID: "83074f2a-d218-47f7-8c37-8b5195f77210"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:43:25 crc kubenswrapper[4860]: I0320 11:43:25.337948 4860 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83074f2a-d218-47f7-8c37-8b5195f77210-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:43:25 crc kubenswrapper[4860]: I0320 11:43:25.671866 4860 generic.go:334] "Generic (PLEG): container finished" podID="83074f2a-d218-47f7-8c37-8b5195f77210" containerID="de6e2e6924a1a544a7dd810a70e0a5780b512d6321f32637c1a4e4c4baf53578" exitCode=0 Mar 20 11:43:25 crc kubenswrapper[4860]: I0320 11:43:25.671908 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fzc9q" event={"ID":"83074f2a-d218-47f7-8c37-8b5195f77210","Type":"ContainerDied","Data":"de6e2e6924a1a544a7dd810a70e0a5780b512d6321f32637c1a4e4c4baf53578"} Mar 20 11:43:25 crc kubenswrapper[4860]: I0320 11:43:25.671972 4860 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-fzc9q" event={"ID":"83074f2a-d218-47f7-8c37-8b5195f77210","Type":"ContainerDied","Data":"8c2d1e411a59750e89077ddc91c2309ab3e7983b7e0eb52dca2b617cef84b7ff"} Mar 20 11:43:25 crc kubenswrapper[4860]: I0320 11:43:25.671993 4860 scope.go:117] "RemoveContainer" containerID="de6e2e6924a1a544a7dd810a70e0a5780b512d6321f32637c1a4e4c4baf53578" Mar 20 11:43:25 crc kubenswrapper[4860]: I0320 11:43:25.672034 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fzc9q" Mar 20 11:43:25 crc kubenswrapper[4860]: I0320 11:43:25.692960 4860 scope.go:117] "RemoveContainer" containerID="f056246561ac73e753fd4dcd7635df4308d4205a262f90e122068c8479290fda" Mar 20 11:43:25 crc kubenswrapper[4860]: I0320 11:43:25.702970 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fzc9q"] Mar 20 11:43:25 crc kubenswrapper[4860]: I0320 11:43:25.708999 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fzc9q"] Mar 20 11:43:25 crc kubenswrapper[4860]: I0320 11:43:25.718931 4860 scope.go:117] "RemoveContainer" containerID="360f2a33729f08c427fc3929d1af0e724d578c412b1459b26ec5648e8ddf3a0d" Mar 20 11:43:25 crc kubenswrapper[4860]: I0320 11:43:25.737050 4860 scope.go:117] "RemoveContainer" containerID="de6e2e6924a1a544a7dd810a70e0a5780b512d6321f32637c1a4e4c4baf53578" Mar 20 11:43:25 crc kubenswrapper[4860]: E0320 11:43:25.737582 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de6e2e6924a1a544a7dd810a70e0a5780b512d6321f32637c1a4e4c4baf53578\": container with ID starting with de6e2e6924a1a544a7dd810a70e0a5780b512d6321f32637c1a4e4c4baf53578 not found: ID does not exist" containerID="de6e2e6924a1a544a7dd810a70e0a5780b512d6321f32637c1a4e4c4baf53578" Mar 20 11:43:25 crc kubenswrapper[4860]: I0320 11:43:25.737619 4860 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de6e2e6924a1a544a7dd810a70e0a5780b512d6321f32637c1a4e4c4baf53578"} err="failed to get container status \"de6e2e6924a1a544a7dd810a70e0a5780b512d6321f32637c1a4e4c4baf53578\": rpc error: code = NotFound desc = could not find container \"de6e2e6924a1a544a7dd810a70e0a5780b512d6321f32637c1a4e4c4baf53578\": container with ID starting with de6e2e6924a1a544a7dd810a70e0a5780b512d6321f32637c1a4e4c4baf53578 not found: ID does not exist" Mar 20 11:43:25 crc kubenswrapper[4860]: I0320 11:43:25.737726 4860 scope.go:117] "RemoveContainer" containerID="f056246561ac73e753fd4dcd7635df4308d4205a262f90e122068c8479290fda" Mar 20 11:43:25 crc kubenswrapper[4860]: E0320 11:43:25.738099 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f056246561ac73e753fd4dcd7635df4308d4205a262f90e122068c8479290fda\": container with ID starting with f056246561ac73e753fd4dcd7635df4308d4205a262f90e122068c8479290fda not found: ID does not exist" containerID="f056246561ac73e753fd4dcd7635df4308d4205a262f90e122068c8479290fda" Mar 20 11:43:25 crc kubenswrapper[4860]: I0320 11:43:25.738136 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f056246561ac73e753fd4dcd7635df4308d4205a262f90e122068c8479290fda"} err="failed to get container status \"f056246561ac73e753fd4dcd7635df4308d4205a262f90e122068c8479290fda\": rpc error: code = NotFound desc = could not find container \"f056246561ac73e753fd4dcd7635df4308d4205a262f90e122068c8479290fda\": container with ID starting with f056246561ac73e753fd4dcd7635df4308d4205a262f90e122068c8479290fda not found: ID does not exist" Mar 20 11:43:25 crc kubenswrapper[4860]: I0320 11:43:25.738154 4860 scope.go:117] "RemoveContainer" containerID="360f2a33729f08c427fc3929d1af0e724d578c412b1459b26ec5648e8ddf3a0d" Mar 20 11:43:25 crc kubenswrapper[4860]: E0320 
11:43:25.738454 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"360f2a33729f08c427fc3929d1af0e724d578c412b1459b26ec5648e8ddf3a0d\": container with ID starting with 360f2a33729f08c427fc3929d1af0e724d578c412b1459b26ec5648e8ddf3a0d not found: ID does not exist" containerID="360f2a33729f08c427fc3929d1af0e724d578c412b1459b26ec5648e8ddf3a0d" Mar 20 11:43:25 crc kubenswrapper[4860]: I0320 11:43:25.738500 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"360f2a33729f08c427fc3929d1af0e724d578c412b1459b26ec5648e8ddf3a0d"} err="failed to get container status \"360f2a33729f08c427fc3929d1af0e724d578c412b1459b26ec5648e8ddf3a0d\": rpc error: code = NotFound desc = could not find container \"360f2a33729f08c427fc3929d1af0e724d578c412b1459b26ec5648e8ddf3a0d\": container with ID starting with 360f2a33729f08c427fc3929d1af0e724d578c412b1459b26ec5648e8ddf3a0d not found: ID does not exist" Mar 20 11:43:27 crc kubenswrapper[4860]: I0320 11:43:27.426789 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83074f2a-d218-47f7-8c37-8b5195f77210" path="/var/lib/kubelet/pods/83074f2a-d218-47f7-8c37-8b5195f77210/volumes" Mar 20 11:43:52 crc kubenswrapper[4860]: I0320 11:43:52.344602 4860 patch_prober.go:28] interesting pod/machine-config-daemon-kvdqp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:43:52 crc kubenswrapper[4860]: I0320 11:43:52.345547 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Mar 20 11:43:52 crc kubenswrapper[4860]: I0320 11:43:52.345606 4860 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" Mar 20 11:43:52 crc kubenswrapper[4860]: I0320 11:43:52.346374 4860 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2c6cd87f8a77136a1839b5f761bdf9d0b06d37fb07e23d5035e6ffed2dcf296d"} pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 11:43:52 crc kubenswrapper[4860]: I0320 11:43:52.346448 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" containerID="cri-o://2c6cd87f8a77136a1839b5f761bdf9d0b06d37fb07e23d5035e6ffed2dcf296d" gracePeriod=600 Mar 20 11:43:52 crc kubenswrapper[4860]: E0320 11:43:52.468768 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:43:53 crc kubenswrapper[4860]: I0320 11:43:53.058127 4860 generic.go:334] "Generic (PLEG): container finished" podID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerID="2c6cd87f8a77136a1839b5f761bdf9d0b06d37fb07e23d5035e6ffed2dcf296d" exitCode=0 Mar 20 11:43:53 crc kubenswrapper[4860]: I0320 11:43:53.058168 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" 
event={"ID":"6a9df230-75a1-4b64-8d00-c179e9c19080","Type":"ContainerDied","Data":"2c6cd87f8a77136a1839b5f761bdf9d0b06d37fb07e23d5035e6ffed2dcf296d"} Mar 20 11:43:53 crc kubenswrapper[4860]: I0320 11:43:53.058260 4860 scope.go:117] "RemoveContainer" containerID="855d8453e4234a7aa607d0c491023b12563d04004d3513d67a5fbedfc07f707d" Mar 20 11:43:53 crc kubenswrapper[4860]: I0320 11:43:53.058828 4860 scope.go:117] "RemoveContainer" containerID="2c6cd87f8a77136a1839b5f761bdf9d0b06d37fb07e23d5035e6ffed2dcf296d" Mar 20 11:43:53 crc kubenswrapper[4860]: E0320 11:43:53.059062 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:43:53 crc kubenswrapper[4860]: I0320 11:43:53.554875 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-d78wv"] Mar 20 11:43:53 crc kubenswrapper[4860]: E0320 11:43:53.555306 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83074f2a-d218-47f7-8c37-8b5195f77210" containerName="registry-server" Mar 20 11:43:53 crc kubenswrapper[4860]: I0320 11:43:53.555321 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="83074f2a-d218-47f7-8c37-8b5195f77210" containerName="registry-server" Mar 20 11:43:53 crc kubenswrapper[4860]: E0320 11:43:53.555341 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83074f2a-d218-47f7-8c37-8b5195f77210" containerName="extract-utilities" Mar 20 11:43:53 crc kubenswrapper[4860]: I0320 11:43:53.555347 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="83074f2a-d218-47f7-8c37-8b5195f77210" containerName="extract-utilities" Mar 20 11:43:53 crc 
kubenswrapper[4860]: E0320 11:43:53.555354 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83074f2a-d218-47f7-8c37-8b5195f77210" containerName="extract-content" Mar 20 11:43:53 crc kubenswrapper[4860]: I0320 11:43:53.555362 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="83074f2a-d218-47f7-8c37-8b5195f77210" containerName="extract-content" Mar 20 11:43:53 crc kubenswrapper[4860]: I0320 11:43:53.555493 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="83074f2a-d218-47f7-8c37-8b5195f77210" containerName="registry-server" Mar 20 11:43:53 crc kubenswrapper[4860]: I0320 11:43:53.556656 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d78wv" Mar 20 11:43:53 crc kubenswrapper[4860]: I0320 11:43:53.570904 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d78wv"] Mar 20 11:43:53 crc kubenswrapper[4860]: I0320 11:43:53.708412 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a-utilities\") pod \"redhat-marketplace-d78wv\" (UID: \"57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a\") " pod="openshift-marketplace/redhat-marketplace-d78wv" Mar 20 11:43:53 crc kubenswrapper[4860]: I0320 11:43:53.708502 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxl7f\" (UniqueName: \"kubernetes.io/projected/57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a-kube-api-access-gxl7f\") pod \"redhat-marketplace-d78wv\" (UID: \"57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a\") " pod="openshift-marketplace/redhat-marketplace-d78wv" Mar 20 11:43:53 crc kubenswrapper[4860]: I0320 11:43:53.708526 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a-catalog-content\") pod \"redhat-marketplace-d78wv\" (UID: \"57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a\") " pod="openshift-marketplace/redhat-marketplace-d78wv" Mar 20 11:43:53 crc kubenswrapper[4860]: I0320 11:43:53.809541 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a-utilities\") pod \"redhat-marketplace-d78wv\" (UID: \"57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a\") " pod="openshift-marketplace/redhat-marketplace-d78wv" Mar 20 11:43:53 crc kubenswrapper[4860]: I0320 11:43:53.809644 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxl7f\" (UniqueName: \"kubernetes.io/projected/57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a-kube-api-access-gxl7f\") pod \"redhat-marketplace-d78wv\" (UID: \"57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a\") " pod="openshift-marketplace/redhat-marketplace-d78wv" Mar 20 11:43:53 crc kubenswrapper[4860]: I0320 11:43:53.809669 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a-catalog-content\") pod \"redhat-marketplace-d78wv\" (UID: \"57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a\") " pod="openshift-marketplace/redhat-marketplace-d78wv" Mar 20 11:43:53 crc kubenswrapper[4860]: I0320 11:43:53.810071 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a-utilities\") pod \"redhat-marketplace-d78wv\" (UID: \"57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a\") " pod="openshift-marketplace/redhat-marketplace-d78wv" Mar 20 11:43:53 crc kubenswrapper[4860]: I0320 11:43:53.810187 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a-catalog-content\") pod \"redhat-marketplace-d78wv\" (UID: \"57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a\") " pod="openshift-marketplace/redhat-marketplace-d78wv" Mar 20 11:43:53 crc kubenswrapper[4860]: I0320 11:43:53.836098 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxl7f\" (UniqueName: \"kubernetes.io/projected/57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a-kube-api-access-gxl7f\") pod \"redhat-marketplace-d78wv\" (UID: \"57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a\") " pod="openshift-marketplace/redhat-marketplace-d78wv" Mar 20 11:43:53 crc kubenswrapper[4860]: I0320 11:43:53.875648 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d78wv" Mar 20 11:43:54 crc kubenswrapper[4860]: I0320 11:43:54.372628 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d78wv"] Mar 20 11:43:55 crc kubenswrapper[4860]: I0320 11:43:55.102680 4860 generic.go:334] "Generic (PLEG): container finished" podID="57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a" containerID="475e3880bff1e3fbf576238a9c0745442d7100eea10561e119a99ffa1fd6e182" exitCode=0 Mar 20 11:43:55 crc kubenswrapper[4860]: I0320 11:43:55.102912 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d78wv" event={"ID":"57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a","Type":"ContainerDied","Data":"475e3880bff1e3fbf576238a9c0745442d7100eea10561e119a99ffa1fd6e182"} Mar 20 11:43:55 crc kubenswrapper[4860]: I0320 11:43:55.103165 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d78wv" event={"ID":"57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a","Type":"ContainerStarted","Data":"8c98d7e4ab87dca42cddfef62a946a203e0380f2d3bb440c1fe269c8f7a89579"} Mar 20 11:43:56 crc kubenswrapper[4860]: I0320 11:43:56.115850 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-d78wv" event={"ID":"57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a","Type":"ContainerStarted","Data":"0c4ff035ec5a56ad01f329fb002f9a6ff79489a6fd1e4b8b2a20d7a28d179176"} Mar 20 11:43:57 crc kubenswrapper[4860]: I0320 11:43:57.123788 4860 generic.go:334] "Generic (PLEG): container finished" podID="57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a" containerID="0c4ff035ec5a56ad01f329fb002f9a6ff79489a6fd1e4b8b2a20d7a28d179176" exitCode=0 Mar 20 11:43:57 crc kubenswrapper[4860]: I0320 11:43:57.123840 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d78wv" event={"ID":"57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a","Type":"ContainerDied","Data":"0c4ff035ec5a56ad01f329fb002f9a6ff79489a6fd1e4b8b2a20d7a28d179176"} Mar 20 11:43:58 crc kubenswrapper[4860]: I0320 11:43:58.156658 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d78wv" event={"ID":"57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a","Type":"ContainerStarted","Data":"d1282cf4606bd21a07325c48667aa660023db3bb611e39a453b19febefbd717c"} Mar 20 11:43:58 crc kubenswrapper[4860]: I0320 11:43:58.189787 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-d78wv" podStartSLOduration=2.775757859 podStartE2EDuration="5.189761093s" podCreationTimestamp="2026-03-20 11:43:53 +0000 UTC" firstStartedPulling="2026-03-20 11:43:55.105542007 +0000 UTC m=+2959.326902905" lastFinishedPulling="2026-03-20 11:43:57.519545241 +0000 UTC m=+2961.740906139" observedRunningTime="2026-03-20 11:43:58.181893921 +0000 UTC m=+2962.403254839" watchObservedRunningTime="2026-03-20 11:43:58.189761093 +0000 UTC m=+2962.411121991" Mar 20 11:44:00 crc kubenswrapper[4860]: I0320 11:44:00.171711 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566784-k8pc4"] Mar 20 11:44:00 crc kubenswrapper[4860]: I0320 11:44:00.173057 4860 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566784-k8pc4" Mar 20 11:44:00 crc kubenswrapper[4860]: I0320 11:44:00.176272 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-82p2f" Mar 20 11:44:00 crc kubenswrapper[4860]: I0320 11:44:00.176749 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:44:00 crc kubenswrapper[4860]: I0320 11:44:00.180685 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:44:00 crc kubenswrapper[4860]: I0320 11:44:00.197262 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566784-k8pc4"] Mar 20 11:44:00 crc kubenswrapper[4860]: I0320 11:44:00.217441 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpdx4\" (UniqueName: \"kubernetes.io/projected/1d3485ef-9918-4aa6-80d1-c1c295d46ebe-kube-api-access-dpdx4\") pod \"auto-csr-approver-29566784-k8pc4\" (UID: \"1d3485ef-9918-4aa6-80d1-c1c295d46ebe\") " pod="openshift-infra/auto-csr-approver-29566784-k8pc4" Mar 20 11:44:00 crc kubenswrapper[4860]: I0320 11:44:00.319103 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpdx4\" (UniqueName: \"kubernetes.io/projected/1d3485ef-9918-4aa6-80d1-c1c295d46ebe-kube-api-access-dpdx4\") pod \"auto-csr-approver-29566784-k8pc4\" (UID: \"1d3485ef-9918-4aa6-80d1-c1c295d46ebe\") " pod="openshift-infra/auto-csr-approver-29566784-k8pc4" Mar 20 11:44:00 crc kubenswrapper[4860]: I0320 11:44:00.348288 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpdx4\" (UniqueName: \"kubernetes.io/projected/1d3485ef-9918-4aa6-80d1-c1c295d46ebe-kube-api-access-dpdx4\") pod \"auto-csr-approver-29566784-k8pc4\" (UID: 
\"1d3485ef-9918-4aa6-80d1-c1c295d46ebe\") " pod="openshift-infra/auto-csr-approver-29566784-k8pc4"
Mar 20 11:44:00 crc kubenswrapper[4860]: I0320 11:44:00.490587 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566784-k8pc4"
Mar 20 11:44:00 crc kubenswrapper[4860]: I0320 11:44:00.953978 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566784-k8pc4"]
Mar 20 11:44:01 crc kubenswrapper[4860]: I0320 11:44:01.180187 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566784-k8pc4" event={"ID":"1d3485ef-9918-4aa6-80d1-c1c295d46ebe","Type":"ContainerStarted","Data":"ff3cf6fead59b96d5552c0911ddb64bba5a8801af8dfced69f64cf229996bd7a"}
Mar 20 11:44:03 crc kubenswrapper[4860]: I0320 11:44:03.197423 4860 generic.go:334] "Generic (PLEG): container finished" podID="1d3485ef-9918-4aa6-80d1-c1c295d46ebe" containerID="7c62f1c8ef0515a28ab838f145210c3776f9c242b812e79c909a339bcd0bc452" exitCode=0
Mar 20 11:44:03 crc kubenswrapper[4860]: I0320 11:44:03.197538 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566784-k8pc4" event={"ID":"1d3485ef-9918-4aa6-80d1-c1c295d46ebe","Type":"ContainerDied","Data":"7c62f1c8ef0515a28ab838f145210c3776f9c242b812e79c909a339bcd0bc452"}
Mar 20 11:44:03 crc kubenswrapper[4860]: I0320 11:44:03.877462 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-d78wv"
Mar 20 11:44:03 crc kubenswrapper[4860]: I0320 11:44:03.877545 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-d78wv"
Mar 20 11:44:03 crc kubenswrapper[4860]: I0320 11:44:03.931970 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-d78wv"
Mar 20 11:44:04 crc kubenswrapper[4860]: I0320 11:44:04.254296 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-d78wv"
Mar 20 11:44:04 crc kubenswrapper[4860]: I0320 11:44:04.324254 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d78wv"]
Mar 20 11:44:04 crc kubenswrapper[4860]: I0320 11:44:04.496045 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566784-k8pc4"
Mar 20 11:44:04 crc kubenswrapper[4860]: I0320 11:44:04.705861 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpdx4\" (UniqueName: \"kubernetes.io/projected/1d3485ef-9918-4aa6-80d1-c1c295d46ebe-kube-api-access-dpdx4\") pod \"1d3485ef-9918-4aa6-80d1-c1c295d46ebe\" (UID: \"1d3485ef-9918-4aa6-80d1-c1c295d46ebe\") "
Mar 20 11:44:04 crc kubenswrapper[4860]: I0320 11:44:04.715479 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d3485ef-9918-4aa6-80d1-c1c295d46ebe-kube-api-access-dpdx4" (OuterVolumeSpecName: "kube-api-access-dpdx4") pod "1d3485ef-9918-4aa6-80d1-c1c295d46ebe" (UID: "1d3485ef-9918-4aa6-80d1-c1c295d46ebe"). InnerVolumeSpecName "kube-api-access-dpdx4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 11:44:04 crc kubenswrapper[4860]: I0320 11:44:04.807742 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dpdx4\" (UniqueName: \"kubernetes.io/projected/1d3485ef-9918-4aa6-80d1-c1c295d46ebe-kube-api-access-dpdx4\") on node \"crc\" DevicePath \"\""
Mar 20 11:44:05 crc kubenswrapper[4860]: I0320 11:44:05.216563 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566784-k8pc4"
Mar 20 11:44:05 crc kubenswrapper[4860]: I0320 11:44:05.216535 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566784-k8pc4" event={"ID":"1d3485ef-9918-4aa6-80d1-c1c295d46ebe","Type":"ContainerDied","Data":"ff3cf6fead59b96d5552c0911ddb64bba5a8801af8dfced69f64cf229996bd7a"}
Mar 20 11:44:05 crc kubenswrapper[4860]: I0320 11:44:05.216630 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff3cf6fead59b96d5552c0911ddb64bba5a8801af8dfced69f64cf229996bd7a"
Mar 20 11:44:05 crc kubenswrapper[4860]: I0320 11:44:05.413746 4860 scope.go:117] "RemoveContainer" containerID="2c6cd87f8a77136a1839b5f761bdf9d0b06d37fb07e23d5035e6ffed2dcf296d"
Mar 20 11:44:05 crc kubenswrapper[4860]: E0320 11:44:05.413985 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080"
Mar 20 11:44:05 crc kubenswrapper[4860]: I0320 11:44:05.597651 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566778-dqv4l"]
Mar 20 11:44:05 crc kubenswrapper[4860]: I0320 11:44:05.604687 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566778-dqv4l"]
Mar 20 11:44:06 crc kubenswrapper[4860]: I0320 11:44:06.223653 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-d78wv" podUID="57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a" containerName="registry-server" containerID="cri-o://d1282cf4606bd21a07325c48667aa660023db3bb611e39a453b19febefbd717c" gracePeriod=2
Mar 20 11:44:06 crc kubenswrapper[4860]: I0320 11:44:06.635357 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d78wv"
Mar 20 11:44:06 crc kubenswrapper[4860]: I0320 11:44:06.737705 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxl7f\" (UniqueName: \"kubernetes.io/projected/57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a-kube-api-access-gxl7f\") pod \"57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a\" (UID: \"57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a\") "
Mar 20 11:44:06 crc kubenswrapper[4860]: I0320 11:44:06.737842 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a-catalog-content\") pod \"57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a\" (UID: \"57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a\") "
Mar 20 11:44:06 crc kubenswrapper[4860]: I0320 11:44:06.738006 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a-utilities\") pod \"57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a\" (UID: \"57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a\") "
Mar 20 11:44:06 crc kubenswrapper[4860]: I0320 11:44:06.739040 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a-utilities" (OuterVolumeSpecName: "utilities") pod "57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a" (UID: "57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 11:44:06 crc kubenswrapper[4860]: I0320 11:44:06.744814 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a-kube-api-access-gxl7f" (OuterVolumeSpecName: "kube-api-access-gxl7f") pod "57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a" (UID: "57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a"). InnerVolumeSpecName "kube-api-access-gxl7f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 11:44:06 crc kubenswrapper[4860]: I0320 11:44:06.831857 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a" (UID: "57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 11:44:06 crc kubenswrapper[4860]: I0320 11:44:06.839710 4860 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 11:44:06 crc kubenswrapper[4860]: I0320 11:44:06.839765 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxl7f\" (UniqueName: \"kubernetes.io/projected/57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a-kube-api-access-gxl7f\") on node \"crc\" DevicePath \"\""
Mar 20 11:44:06 crc kubenswrapper[4860]: I0320 11:44:06.839776 4860 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 11:44:07 crc kubenswrapper[4860]: I0320 11:44:07.235409 4860 generic.go:334] "Generic (PLEG): container finished" podID="57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a" containerID="d1282cf4606bd21a07325c48667aa660023db3bb611e39a453b19febefbd717c" exitCode=0
Mar 20 11:44:07 crc kubenswrapper[4860]: I0320 11:44:07.235471 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d78wv" event={"ID":"57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a","Type":"ContainerDied","Data":"d1282cf4606bd21a07325c48667aa660023db3bb611e39a453b19febefbd717c"}
Mar 20 11:44:07 crc kubenswrapper[4860]: I0320 11:44:07.235513 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d78wv" event={"ID":"57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a","Type":"ContainerDied","Data":"8c98d7e4ab87dca42cddfef62a946a203e0380f2d3bb440c1fe269c8f7a89579"}
Mar 20 11:44:07 crc kubenswrapper[4860]: I0320 11:44:07.235542 4860 scope.go:117] "RemoveContainer" containerID="d1282cf4606bd21a07325c48667aa660023db3bb611e39a453b19febefbd717c"
Mar 20 11:44:07 crc kubenswrapper[4860]: I0320 11:44:07.235553 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d78wv"
Mar 20 11:44:07 crc kubenswrapper[4860]: I0320 11:44:07.263802 4860 scope.go:117] "RemoveContainer" containerID="0c4ff035ec5a56ad01f329fb002f9a6ff79489a6fd1e4b8b2a20d7a28d179176"
Mar 20 11:44:07 crc kubenswrapper[4860]: I0320 11:44:07.274499 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d78wv"]
Mar 20 11:44:07 crc kubenswrapper[4860]: I0320 11:44:07.282690 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-d78wv"]
Mar 20 11:44:07 crc kubenswrapper[4860]: I0320 11:44:07.302982 4860 scope.go:117] "RemoveContainer" containerID="475e3880bff1e3fbf576238a9c0745442d7100eea10561e119a99ffa1fd6e182"
Mar 20 11:44:07 crc kubenswrapper[4860]: I0320 11:44:07.323393 4860 scope.go:117] "RemoveContainer" containerID="d1282cf4606bd21a07325c48667aa660023db3bb611e39a453b19febefbd717c"
Mar 20 11:44:07 crc kubenswrapper[4860]: E0320 11:44:07.325153 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1282cf4606bd21a07325c48667aa660023db3bb611e39a453b19febefbd717c\": container with ID starting with d1282cf4606bd21a07325c48667aa660023db3bb611e39a453b19febefbd717c not found: ID does not exist" containerID="d1282cf4606bd21a07325c48667aa660023db3bb611e39a453b19febefbd717c"
Mar 20 11:44:07 crc kubenswrapper[4860]: I0320 11:44:07.325195 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1282cf4606bd21a07325c48667aa660023db3bb611e39a453b19febefbd717c"} err="failed to get container status \"d1282cf4606bd21a07325c48667aa660023db3bb611e39a453b19febefbd717c\": rpc error: code = NotFound desc = could not find container \"d1282cf4606bd21a07325c48667aa660023db3bb611e39a453b19febefbd717c\": container with ID starting with d1282cf4606bd21a07325c48667aa660023db3bb611e39a453b19febefbd717c not found: ID does not exist"
Mar 20 11:44:07 crc kubenswrapper[4860]: I0320 11:44:07.325235 4860 scope.go:117] "RemoveContainer" containerID="0c4ff035ec5a56ad01f329fb002f9a6ff79489a6fd1e4b8b2a20d7a28d179176"
Mar 20 11:44:07 crc kubenswrapper[4860]: E0320 11:44:07.325729 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c4ff035ec5a56ad01f329fb002f9a6ff79489a6fd1e4b8b2a20d7a28d179176\": container with ID starting with 0c4ff035ec5a56ad01f329fb002f9a6ff79489a6fd1e4b8b2a20d7a28d179176 not found: ID does not exist" containerID="0c4ff035ec5a56ad01f329fb002f9a6ff79489a6fd1e4b8b2a20d7a28d179176"
Mar 20 11:44:07 crc kubenswrapper[4860]: I0320 11:44:07.325760 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c4ff035ec5a56ad01f329fb002f9a6ff79489a6fd1e4b8b2a20d7a28d179176"} err="failed to get container status \"0c4ff035ec5a56ad01f329fb002f9a6ff79489a6fd1e4b8b2a20d7a28d179176\": rpc error: code = NotFound desc = could not find container \"0c4ff035ec5a56ad01f329fb002f9a6ff79489a6fd1e4b8b2a20d7a28d179176\": container with ID starting with 0c4ff035ec5a56ad01f329fb002f9a6ff79489a6fd1e4b8b2a20d7a28d179176 not found: ID does not exist"
Mar 20 11:44:07 crc kubenswrapper[4860]: I0320 11:44:07.325778 4860 scope.go:117] "RemoveContainer" containerID="475e3880bff1e3fbf576238a9c0745442d7100eea10561e119a99ffa1fd6e182"
Mar 20 11:44:07 crc kubenswrapper[4860]: E0320 11:44:07.326116 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"475e3880bff1e3fbf576238a9c0745442d7100eea10561e119a99ffa1fd6e182\": container with ID starting with 475e3880bff1e3fbf576238a9c0745442d7100eea10561e119a99ffa1fd6e182 not found: ID does not exist" containerID="475e3880bff1e3fbf576238a9c0745442d7100eea10561e119a99ffa1fd6e182"
Mar 20 11:44:07 crc kubenswrapper[4860]: I0320 11:44:07.326139 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"475e3880bff1e3fbf576238a9c0745442d7100eea10561e119a99ffa1fd6e182"} err="failed to get container status \"475e3880bff1e3fbf576238a9c0745442d7100eea10561e119a99ffa1fd6e182\": rpc error: code = NotFound desc = could not find container \"475e3880bff1e3fbf576238a9c0745442d7100eea10561e119a99ffa1fd6e182\": container with ID starting with 475e3880bff1e3fbf576238a9c0745442d7100eea10561e119a99ffa1fd6e182 not found: ID does not exist"
Mar 20 11:44:07 crc kubenswrapper[4860]: I0320 11:44:07.428129 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04737161-e8a4-4231-8b96-1a617b9561a7" path="/var/lib/kubelet/pods/04737161-e8a4-4231-8b96-1a617b9561a7/volumes"
Mar 20 11:44:07 crc kubenswrapper[4860]: I0320 11:44:07.430201 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a" path="/var/lib/kubelet/pods/57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a/volumes"
Mar 20 11:44:14 crc kubenswrapper[4860]: I0320 11:44:14.051383 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-597xm"]
Mar 20 11:44:14 crc kubenswrapper[4860]: E0320 11:44:14.052815 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a" containerName="extract-utilities"
Mar 20 11:44:14 crc kubenswrapper[4860]: I0320 11:44:14.052833 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a" containerName="extract-utilities"
Mar 20 11:44:14 crc kubenswrapper[4860]: E0320 11:44:14.052866 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a" containerName="extract-content"
Mar 20 11:44:14 crc kubenswrapper[4860]: I0320 11:44:14.052872 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a" containerName="extract-content"
Mar 20 11:44:14 crc kubenswrapper[4860]: E0320 11:44:14.052886 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d3485ef-9918-4aa6-80d1-c1c295d46ebe" containerName="oc"
Mar 20 11:44:14 crc kubenswrapper[4860]: I0320 11:44:14.052897 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d3485ef-9918-4aa6-80d1-c1c295d46ebe" containerName="oc"
Mar 20 11:44:14 crc kubenswrapper[4860]: E0320 11:44:14.052914 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a" containerName="registry-server"
Mar 20 11:44:14 crc kubenswrapper[4860]: I0320 11:44:14.052920 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a" containerName="registry-server"
Mar 20 11:44:14 crc kubenswrapper[4860]: I0320 11:44:14.053082 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a" containerName="registry-server"
Mar 20 11:44:14 crc kubenswrapper[4860]: I0320 11:44:14.053098 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d3485ef-9918-4aa6-80d1-c1c295d46ebe" containerName="oc"
Mar 20 11:44:14 crc kubenswrapper[4860]: I0320 11:44:14.054528 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-597xm"
Mar 20 11:44:14 crc kubenswrapper[4860]: I0320 11:44:14.057350 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdad63c3-dc15-41fd-acbc-6451e3dfea6b-utilities\") pod \"certified-operators-597xm\" (UID: \"bdad63c3-dc15-41fd-acbc-6451e3dfea6b\") " pod="openshift-marketplace/certified-operators-597xm"
Mar 20 11:44:14 crc kubenswrapper[4860]: I0320 11:44:14.057412 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdad63c3-dc15-41fd-acbc-6451e3dfea6b-catalog-content\") pod \"certified-operators-597xm\" (UID: \"bdad63c3-dc15-41fd-acbc-6451e3dfea6b\") " pod="openshift-marketplace/certified-operators-597xm"
Mar 20 11:44:14 crc kubenswrapper[4860]: I0320 11:44:14.057435 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbl7l\" (UniqueName: \"kubernetes.io/projected/bdad63c3-dc15-41fd-acbc-6451e3dfea6b-kube-api-access-tbl7l\") pod \"certified-operators-597xm\" (UID: \"bdad63c3-dc15-41fd-acbc-6451e3dfea6b\") " pod="openshift-marketplace/certified-operators-597xm"
Mar 20 11:44:14 crc kubenswrapper[4860]: I0320 11:44:14.065901 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-597xm"]
Mar 20 11:44:14 crc kubenswrapper[4860]: I0320 11:44:14.158769 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdad63c3-dc15-41fd-acbc-6451e3dfea6b-catalog-content\") pod \"certified-operators-597xm\" (UID: \"bdad63c3-dc15-41fd-acbc-6451e3dfea6b\") " pod="openshift-marketplace/certified-operators-597xm"
Mar 20 11:44:14 crc kubenswrapper[4860]: I0320 11:44:14.158836 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbl7l\" (UniqueName: \"kubernetes.io/projected/bdad63c3-dc15-41fd-acbc-6451e3dfea6b-kube-api-access-tbl7l\") pod \"certified-operators-597xm\" (UID: \"bdad63c3-dc15-41fd-acbc-6451e3dfea6b\") " pod="openshift-marketplace/certified-operators-597xm"
Mar 20 11:44:14 crc kubenswrapper[4860]: I0320 11:44:14.158923 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdad63c3-dc15-41fd-acbc-6451e3dfea6b-utilities\") pod \"certified-operators-597xm\" (UID: \"bdad63c3-dc15-41fd-acbc-6451e3dfea6b\") " pod="openshift-marketplace/certified-operators-597xm"
Mar 20 11:44:14 crc kubenswrapper[4860]: I0320 11:44:14.159583 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdad63c3-dc15-41fd-acbc-6451e3dfea6b-utilities\") pod \"certified-operators-597xm\" (UID: \"bdad63c3-dc15-41fd-acbc-6451e3dfea6b\") " pod="openshift-marketplace/certified-operators-597xm"
Mar 20 11:44:14 crc kubenswrapper[4860]: I0320 11:44:14.159687 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdad63c3-dc15-41fd-acbc-6451e3dfea6b-catalog-content\") pod \"certified-operators-597xm\" (UID: \"bdad63c3-dc15-41fd-acbc-6451e3dfea6b\") " pod="openshift-marketplace/certified-operators-597xm"
Mar 20 11:44:14 crc kubenswrapper[4860]: I0320 11:44:14.183610 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbl7l\" (UniqueName: \"kubernetes.io/projected/bdad63c3-dc15-41fd-acbc-6451e3dfea6b-kube-api-access-tbl7l\") pod \"certified-operators-597xm\" (UID: \"bdad63c3-dc15-41fd-acbc-6451e3dfea6b\") " pod="openshift-marketplace/certified-operators-597xm"
Mar 20 11:44:14 crc kubenswrapper[4860]: I0320 11:44:14.377577 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-597xm"
Mar 20 11:44:14 crc kubenswrapper[4860]: I0320 11:44:14.675968 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-597xm"]
Mar 20 11:44:15 crc kubenswrapper[4860]: I0320 11:44:15.310390 4860 generic.go:334] "Generic (PLEG): container finished" podID="bdad63c3-dc15-41fd-acbc-6451e3dfea6b" containerID="13b6cbc006b6167fa239b630f0f8692af88a74eeaf7e0b3a96f14e82af19a24d" exitCode=0
Mar 20 11:44:15 crc kubenswrapper[4860]: I0320 11:44:15.310456 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-597xm" event={"ID":"bdad63c3-dc15-41fd-acbc-6451e3dfea6b","Type":"ContainerDied","Data":"13b6cbc006b6167fa239b630f0f8692af88a74eeaf7e0b3a96f14e82af19a24d"}
Mar 20 11:44:15 crc kubenswrapper[4860]: I0320 11:44:15.310861 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-597xm" event={"ID":"bdad63c3-dc15-41fd-acbc-6451e3dfea6b","Type":"ContainerStarted","Data":"9386bd05b36fdd8edc89df0d5cc27cc7e138833813ff37cf522c6edb5dcf33b4"}
Mar 20 11:44:16 crc kubenswrapper[4860]: I0320 11:44:16.320551 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-597xm" event={"ID":"bdad63c3-dc15-41fd-acbc-6451e3dfea6b","Type":"ContainerStarted","Data":"ef241d624d312d6344fcff1c6d1c05806f8145d6ba265be85dc92f64afaeb60f"}
Mar 20 11:44:17 crc kubenswrapper[4860]: I0320 11:44:17.336690 4860 generic.go:334] "Generic (PLEG): container finished" podID="bdad63c3-dc15-41fd-acbc-6451e3dfea6b" containerID="ef241d624d312d6344fcff1c6d1c05806f8145d6ba265be85dc92f64afaeb60f" exitCode=0
Mar 20 11:44:17 crc kubenswrapper[4860]: I0320 11:44:17.336746 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-597xm" event={"ID":"bdad63c3-dc15-41fd-acbc-6451e3dfea6b","Type":"ContainerDied","Data":"ef241d624d312d6344fcff1c6d1c05806f8145d6ba265be85dc92f64afaeb60f"}
Mar 20 11:44:18 crc kubenswrapper[4860]: I0320 11:44:18.348751 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-597xm" event={"ID":"bdad63c3-dc15-41fd-acbc-6451e3dfea6b","Type":"ContainerStarted","Data":"46c422de9458da7fb851166206fa809911d89a7a675245c339fc28c34a21afe0"}
Mar 20 11:44:18 crc kubenswrapper[4860]: I0320 11:44:18.369925 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-597xm" podStartSLOduration=1.8734772849999999 podStartE2EDuration="4.369896927s" podCreationTimestamp="2026-03-20 11:44:14 +0000 UTC" firstStartedPulling="2026-03-20 11:44:15.312115844 +0000 UTC m=+2979.533476742" lastFinishedPulling="2026-03-20 11:44:17.808535486 +0000 UTC m=+2982.029896384" observedRunningTime="2026-03-20 11:44:18.366707 +0000 UTC m=+2982.588067908" watchObservedRunningTime="2026-03-20 11:44:18.369896927 +0000 UTC m=+2982.591257825"
Mar 20 11:44:20 crc kubenswrapper[4860]: I0320 11:44:20.413677 4860 scope.go:117] "RemoveContainer" containerID="2c6cd87f8a77136a1839b5f761bdf9d0b06d37fb07e23d5035e6ffed2dcf296d"
Mar 20 11:44:20 crc kubenswrapper[4860]: E0320 11:44:20.414530 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080"
Mar 20 11:44:24 crc kubenswrapper[4860]: I0320 11:44:24.378574 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-597xm"
Mar 20 11:44:24 crc kubenswrapper[4860]: I0320 11:44:24.379278 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-597xm"
Mar 20 11:44:24 crc kubenswrapper[4860]: I0320 11:44:24.421797 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-597xm"
Mar 20 11:44:24 crc kubenswrapper[4860]: I0320 11:44:24.471583 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-597xm"
Mar 20 11:44:24 crc kubenswrapper[4860]: I0320 11:44:24.753386 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-597xm"]
Mar 20 11:44:26 crc kubenswrapper[4860]: I0320 11:44:26.407844 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-597xm" podUID="bdad63c3-dc15-41fd-acbc-6451e3dfea6b" containerName="registry-server" containerID="cri-o://46c422de9458da7fb851166206fa809911d89a7a675245c339fc28c34a21afe0" gracePeriod=2
Mar 20 11:44:26 crc kubenswrapper[4860]: I0320 11:44:26.814861 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-597xm"
Mar 20 11:44:26 crc kubenswrapper[4860]: I0320 11:44:26.966172 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tbl7l\" (UniqueName: \"kubernetes.io/projected/bdad63c3-dc15-41fd-acbc-6451e3dfea6b-kube-api-access-tbl7l\") pod \"bdad63c3-dc15-41fd-acbc-6451e3dfea6b\" (UID: \"bdad63c3-dc15-41fd-acbc-6451e3dfea6b\") "
Mar 20 11:44:26 crc kubenswrapper[4860]: I0320 11:44:26.966837 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdad63c3-dc15-41fd-acbc-6451e3dfea6b-catalog-content\") pod \"bdad63c3-dc15-41fd-acbc-6451e3dfea6b\" (UID: \"bdad63c3-dc15-41fd-acbc-6451e3dfea6b\") "
Mar 20 11:44:26 crc kubenswrapper[4860]: I0320 11:44:26.966867 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdad63c3-dc15-41fd-acbc-6451e3dfea6b-utilities\") pod \"bdad63c3-dc15-41fd-acbc-6451e3dfea6b\" (UID: \"bdad63c3-dc15-41fd-acbc-6451e3dfea6b\") "
Mar 20 11:44:26 crc kubenswrapper[4860]: I0320 11:44:26.968717 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdad63c3-dc15-41fd-acbc-6451e3dfea6b-utilities" (OuterVolumeSpecName: "utilities") pod "bdad63c3-dc15-41fd-acbc-6451e3dfea6b" (UID: "bdad63c3-dc15-41fd-acbc-6451e3dfea6b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 11:44:26 crc kubenswrapper[4860]: I0320 11:44:26.987010 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdad63c3-dc15-41fd-acbc-6451e3dfea6b-kube-api-access-tbl7l" (OuterVolumeSpecName: "kube-api-access-tbl7l") pod "bdad63c3-dc15-41fd-acbc-6451e3dfea6b" (UID: "bdad63c3-dc15-41fd-acbc-6451e3dfea6b"). InnerVolumeSpecName "kube-api-access-tbl7l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 11:44:27 crc kubenswrapper[4860]: I0320 11:44:27.033554 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdad63c3-dc15-41fd-acbc-6451e3dfea6b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bdad63c3-dc15-41fd-acbc-6451e3dfea6b" (UID: "bdad63c3-dc15-41fd-acbc-6451e3dfea6b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 11:44:27 crc kubenswrapper[4860]: I0320 11:44:27.069304 4860 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdad63c3-dc15-41fd-acbc-6451e3dfea6b-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 11:44:27 crc kubenswrapper[4860]: I0320 11:44:27.069367 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tbl7l\" (UniqueName: \"kubernetes.io/projected/bdad63c3-dc15-41fd-acbc-6451e3dfea6b-kube-api-access-tbl7l\") on node \"crc\" DevicePath \"\""
Mar 20 11:44:27 crc kubenswrapper[4860]: I0320 11:44:27.069388 4860 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdad63c3-dc15-41fd-acbc-6451e3dfea6b-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 11:44:27 crc kubenswrapper[4860]: I0320 11:44:27.419765 4860 generic.go:334] "Generic (PLEG): container finished" podID="bdad63c3-dc15-41fd-acbc-6451e3dfea6b" containerID="46c422de9458da7fb851166206fa809911d89a7a675245c339fc28c34a21afe0" exitCode=0
Mar 20 11:44:27 crc kubenswrapper[4860]: I0320 11:44:27.419902 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-597xm"
Mar 20 11:44:27 crc kubenswrapper[4860]: I0320 11:44:27.432823 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-597xm" event={"ID":"bdad63c3-dc15-41fd-acbc-6451e3dfea6b","Type":"ContainerDied","Data":"46c422de9458da7fb851166206fa809911d89a7a675245c339fc28c34a21afe0"}
Mar 20 11:44:27 crc kubenswrapper[4860]: I0320 11:44:27.433654 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-597xm" event={"ID":"bdad63c3-dc15-41fd-acbc-6451e3dfea6b","Type":"ContainerDied","Data":"9386bd05b36fdd8edc89df0d5cc27cc7e138833813ff37cf522c6edb5dcf33b4"}
Mar 20 11:44:27 crc kubenswrapper[4860]: I0320 11:44:27.433757 4860 scope.go:117] "RemoveContainer" containerID="46c422de9458da7fb851166206fa809911d89a7a675245c339fc28c34a21afe0"
Mar 20 11:44:27 crc kubenswrapper[4860]: I0320 11:44:27.459127 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-597xm"]
Mar 20 11:44:27 crc kubenswrapper[4860]: I0320 11:44:27.462306 4860 scope.go:117] "RemoveContainer" containerID="ef241d624d312d6344fcff1c6d1c05806f8145d6ba265be85dc92f64afaeb60f"
Mar 20 11:44:27 crc kubenswrapper[4860]: I0320 11:44:27.466261 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-597xm"]
Mar 20 11:44:27 crc kubenswrapper[4860]: I0320 11:44:27.484240 4860 scope.go:117] "RemoveContainer" containerID="13b6cbc006b6167fa239b630f0f8692af88a74eeaf7e0b3a96f14e82af19a24d"
Mar 20 11:44:27 crc kubenswrapper[4860]: I0320 11:44:27.507370 4860 scope.go:117] "RemoveContainer" containerID="46c422de9458da7fb851166206fa809911d89a7a675245c339fc28c34a21afe0"
Mar 20 11:44:27 crc kubenswrapper[4860]: E0320 11:44:27.508039 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46c422de9458da7fb851166206fa809911d89a7a675245c339fc28c34a21afe0\": container with ID starting with 46c422de9458da7fb851166206fa809911d89a7a675245c339fc28c34a21afe0 not found: ID does not exist" containerID="46c422de9458da7fb851166206fa809911d89a7a675245c339fc28c34a21afe0"
Mar 20 11:44:27 crc kubenswrapper[4860]: I0320 11:44:27.508117 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46c422de9458da7fb851166206fa809911d89a7a675245c339fc28c34a21afe0"} err="failed to get container status \"46c422de9458da7fb851166206fa809911d89a7a675245c339fc28c34a21afe0\": rpc error: code = NotFound desc = could not find container \"46c422de9458da7fb851166206fa809911d89a7a675245c339fc28c34a21afe0\": container with ID starting with 46c422de9458da7fb851166206fa809911d89a7a675245c339fc28c34a21afe0 not found: ID does not exist"
Mar 20 11:44:27 crc kubenswrapper[4860]: I0320 11:44:27.508150 4860 scope.go:117] "RemoveContainer" containerID="ef241d624d312d6344fcff1c6d1c05806f8145d6ba265be85dc92f64afaeb60f"
Mar 20 11:44:27 crc kubenswrapper[4860]: E0320 11:44:27.508569 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef241d624d312d6344fcff1c6d1c05806f8145d6ba265be85dc92f64afaeb60f\": container with ID starting with ef241d624d312d6344fcff1c6d1c05806f8145d6ba265be85dc92f64afaeb60f not found: ID does not exist" containerID="ef241d624d312d6344fcff1c6d1c05806f8145d6ba265be85dc92f64afaeb60f"
Mar 20 11:44:27 crc kubenswrapper[4860]: I0320 11:44:27.508613 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef241d624d312d6344fcff1c6d1c05806f8145d6ba265be85dc92f64afaeb60f"} err="failed to get container status \"ef241d624d312d6344fcff1c6d1c05806f8145d6ba265be85dc92f64afaeb60f\": rpc error: code = NotFound desc = could not find container \"ef241d624d312d6344fcff1c6d1c05806f8145d6ba265be85dc92f64afaeb60f\": container with ID starting with ef241d624d312d6344fcff1c6d1c05806f8145d6ba265be85dc92f64afaeb60f not found: ID does not exist"
Mar 20 11:44:27 crc kubenswrapper[4860]: I0320 11:44:27.508638 4860 scope.go:117] "RemoveContainer" containerID="13b6cbc006b6167fa239b630f0f8692af88a74eeaf7e0b3a96f14e82af19a24d"
Mar 20 11:44:27 crc kubenswrapper[4860]: E0320 11:44:27.509176 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13b6cbc006b6167fa239b630f0f8692af88a74eeaf7e0b3a96f14e82af19a24d\": container with ID starting with 13b6cbc006b6167fa239b630f0f8692af88a74eeaf7e0b3a96f14e82af19a24d not found: ID does not exist" containerID="13b6cbc006b6167fa239b630f0f8692af88a74eeaf7e0b3a96f14e82af19a24d"
Mar 20 11:44:27 crc kubenswrapper[4860]: I0320 11:44:27.509255 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13b6cbc006b6167fa239b630f0f8692af88a74eeaf7e0b3a96f14e82af19a24d"} err="failed to get container status \"13b6cbc006b6167fa239b630f0f8692af88a74eeaf7e0b3a96f14e82af19a24d\": rpc error: code = NotFound desc = could not find container \"13b6cbc006b6167fa239b630f0f8692af88a74eeaf7e0b3a96f14e82af19a24d\": container with ID starting with 13b6cbc006b6167fa239b630f0f8692af88a74eeaf7e0b3a96f14e82af19a24d not found: ID does not exist"
Mar 20 11:44:29 crc kubenswrapper[4860]: I0320 11:44:29.423011 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdad63c3-dc15-41fd-acbc-6451e3dfea6b" path="/var/lib/kubelet/pods/bdad63c3-dc15-41fd-acbc-6451e3dfea6b/volumes"
Mar 20 11:44:34 crc kubenswrapper[4860]: I0320 11:44:34.413136 4860 scope.go:117] "RemoveContainer" containerID="2c6cd87f8a77136a1839b5f761bdf9d0b06d37fb07e23d5035e6ffed2dcf296d"
Mar 20 11:44:34 crc kubenswrapper[4860]: E0320 11:44:34.414319 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080"
Mar 20 11:44:46 crc kubenswrapper[4860]: I0320 11:44:46.414956 4860 scope.go:117] "RemoveContainer" containerID="2c6cd87f8a77136a1839b5f761bdf9d0b06d37fb07e23d5035e6ffed2dcf296d"
Mar 20 11:44:46 crc kubenswrapper[4860]: E0320 11:44:46.416193 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080"
Mar 20 11:44:49 crc kubenswrapper[4860]: I0320 11:44:49.327540 4860 scope.go:117] "RemoveContainer" containerID="2bf13e1cbb626df84de24a90ddc00424f7dbac653c634127ff56b49722ddadfd"
Mar 20 11:44:57 crc kubenswrapper[4860]: I0320 11:44:57.413668 4860 scope.go:117] "RemoveContainer" containerID="2c6cd87f8a77136a1839b5f761bdf9d0b06d37fb07e23d5035e6ffed2dcf296d"
Mar 20 11:44:57 crc kubenswrapper[4860]: E0320 11:44:57.414404 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080"
Mar 20 11:45:00 crc kubenswrapper[4860]: I0320 11:45:00.157108 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566785-sxf4z"]
Mar 20
11:45:00 crc kubenswrapper[4860]: E0320 11:45:00.158101 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdad63c3-dc15-41fd-acbc-6451e3dfea6b" containerName="extract-utilities" Mar 20 11:45:00 crc kubenswrapper[4860]: I0320 11:45:00.158122 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdad63c3-dc15-41fd-acbc-6451e3dfea6b" containerName="extract-utilities" Mar 20 11:45:00 crc kubenswrapper[4860]: E0320 11:45:00.158155 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdad63c3-dc15-41fd-acbc-6451e3dfea6b" containerName="registry-server" Mar 20 11:45:00 crc kubenswrapper[4860]: I0320 11:45:00.158164 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdad63c3-dc15-41fd-acbc-6451e3dfea6b" containerName="registry-server" Mar 20 11:45:00 crc kubenswrapper[4860]: E0320 11:45:00.158178 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdad63c3-dc15-41fd-acbc-6451e3dfea6b" containerName="extract-content" Mar 20 11:45:00 crc kubenswrapper[4860]: I0320 11:45:00.158187 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdad63c3-dc15-41fd-acbc-6451e3dfea6b" containerName="extract-content" Mar 20 11:45:00 crc kubenswrapper[4860]: I0320 11:45:00.158380 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdad63c3-dc15-41fd-acbc-6451e3dfea6b" containerName="registry-server" Mar 20 11:45:00 crc kubenswrapper[4860]: I0320 11:45:00.160110 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-sxf4z" Mar 20 11:45:00 crc kubenswrapper[4860]: I0320 11:45:00.162954 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 11:45:00 crc kubenswrapper[4860]: I0320 11:45:00.163787 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 11:45:00 crc kubenswrapper[4860]: I0320 11:45:00.170381 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566785-sxf4z"] Mar 20 11:45:00 crc kubenswrapper[4860]: I0320 11:45:00.262826 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nktf4\" (UniqueName: \"kubernetes.io/projected/7cd4763a-4d0b-4052-a286-b5bfa32a2712-kube-api-access-nktf4\") pod \"collect-profiles-29566785-sxf4z\" (UID: \"7cd4763a-4d0b-4052-a286-b5bfa32a2712\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-sxf4z" Mar 20 11:45:00 crc kubenswrapper[4860]: I0320 11:45:00.263392 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7cd4763a-4d0b-4052-a286-b5bfa32a2712-secret-volume\") pod \"collect-profiles-29566785-sxf4z\" (UID: \"7cd4763a-4d0b-4052-a286-b5bfa32a2712\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-sxf4z" Mar 20 11:45:00 crc kubenswrapper[4860]: I0320 11:45:00.263554 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7cd4763a-4d0b-4052-a286-b5bfa32a2712-config-volume\") pod \"collect-profiles-29566785-sxf4z\" (UID: \"7cd4763a-4d0b-4052-a286-b5bfa32a2712\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-sxf4z" Mar 20 11:45:00 crc kubenswrapper[4860]: I0320 11:45:00.365014 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7cd4763a-4d0b-4052-a286-b5bfa32a2712-secret-volume\") pod \"collect-profiles-29566785-sxf4z\" (UID: \"7cd4763a-4d0b-4052-a286-b5bfa32a2712\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-sxf4z" Mar 20 11:45:00 crc kubenswrapper[4860]: I0320 11:45:00.365079 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7cd4763a-4d0b-4052-a286-b5bfa32a2712-config-volume\") pod \"collect-profiles-29566785-sxf4z\" (UID: \"7cd4763a-4d0b-4052-a286-b5bfa32a2712\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-sxf4z" Mar 20 11:45:00 crc kubenswrapper[4860]: I0320 11:45:00.365117 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nktf4\" (UniqueName: \"kubernetes.io/projected/7cd4763a-4d0b-4052-a286-b5bfa32a2712-kube-api-access-nktf4\") pod \"collect-profiles-29566785-sxf4z\" (UID: \"7cd4763a-4d0b-4052-a286-b5bfa32a2712\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-sxf4z" Mar 20 11:45:00 crc kubenswrapper[4860]: I0320 11:45:00.366367 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7cd4763a-4d0b-4052-a286-b5bfa32a2712-config-volume\") pod \"collect-profiles-29566785-sxf4z\" (UID: \"7cd4763a-4d0b-4052-a286-b5bfa32a2712\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-sxf4z" Mar 20 11:45:00 crc kubenswrapper[4860]: I0320 11:45:00.373463 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/7cd4763a-4d0b-4052-a286-b5bfa32a2712-secret-volume\") pod \"collect-profiles-29566785-sxf4z\" (UID: \"7cd4763a-4d0b-4052-a286-b5bfa32a2712\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-sxf4z" Mar 20 11:45:00 crc kubenswrapper[4860]: I0320 11:45:00.385446 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nktf4\" (UniqueName: \"kubernetes.io/projected/7cd4763a-4d0b-4052-a286-b5bfa32a2712-kube-api-access-nktf4\") pod \"collect-profiles-29566785-sxf4z\" (UID: \"7cd4763a-4d0b-4052-a286-b5bfa32a2712\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-sxf4z" Mar 20 11:45:00 crc kubenswrapper[4860]: I0320 11:45:00.494395 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-sxf4z" Mar 20 11:45:00 crc kubenswrapper[4860]: I0320 11:45:00.767763 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566785-sxf4z"] Mar 20 11:45:00 crc kubenswrapper[4860]: I0320 11:45:00.968096 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-sxf4z" event={"ID":"7cd4763a-4d0b-4052-a286-b5bfa32a2712","Type":"ContainerStarted","Data":"06e03ab518047b3abf690513eb34502d3f746664d0cdfa703b39a226ba064688"} Mar 20 11:45:00 crc kubenswrapper[4860]: I0320 11:45:00.968171 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-sxf4z" event={"ID":"7cd4763a-4d0b-4052-a286-b5bfa32a2712","Type":"ContainerStarted","Data":"f104a3386c9ae8e14c7c7d1f332bbfa683c6e689a699938e2a44536f168b9259"} Mar 20 11:45:00 crc kubenswrapper[4860]: I0320 11:45:00.993266 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-sxf4z" 
podStartSLOduration=0.993223547 podStartE2EDuration="993.223547ms" podCreationTimestamp="2026-03-20 11:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 11:45:00.985263282 +0000 UTC m=+3025.206624200" watchObservedRunningTime="2026-03-20 11:45:00.993223547 +0000 UTC m=+3025.214584445" Mar 20 11:45:01 crc kubenswrapper[4860]: I0320 11:45:01.978716 4860 generic.go:334] "Generic (PLEG): container finished" podID="7cd4763a-4d0b-4052-a286-b5bfa32a2712" containerID="06e03ab518047b3abf690513eb34502d3f746664d0cdfa703b39a226ba064688" exitCode=0 Mar 20 11:45:01 crc kubenswrapper[4860]: I0320 11:45:01.978797 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-sxf4z" event={"ID":"7cd4763a-4d0b-4052-a286-b5bfa32a2712","Type":"ContainerDied","Data":"06e03ab518047b3abf690513eb34502d3f746664d0cdfa703b39a226ba064688"} Mar 20 11:45:03 crc kubenswrapper[4860]: I0320 11:45:03.263975 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-sxf4z" Mar 20 11:45:03 crc kubenswrapper[4860]: I0320 11:45:03.413777 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7cd4763a-4d0b-4052-a286-b5bfa32a2712-config-volume\") pod \"7cd4763a-4d0b-4052-a286-b5bfa32a2712\" (UID: \"7cd4763a-4d0b-4052-a286-b5bfa32a2712\") " Mar 20 11:45:03 crc kubenswrapper[4860]: I0320 11:45:03.413921 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nktf4\" (UniqueName: \"kubernetes.io/projected/7cd4763a-4d0b-4052-a286-b5bfa32a2712-kube-api-access-nktf4\") pod \"7cd4763a-4d0b-4052-a286-b5bfa32a2712\" (UID: \"7cd4763a-4d0b-4052-a286-b5bfa32a2712\") " Mar 20 11:45:03 crc kubenswrapper[4860]: I0320 11:45:03.414019 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7cd4763a-4d0b-4052-a286-b5bfa32a2712-secret-volume\") pod \"7cd4763a-4d0b-4052-a286-b5bfa32a2712\" (UID: \"7cd4763a-4d0b-4052-a286-b5bfa32a2712\") " Mar 20 11:45:03 crc kubenswrapper[4860]: I0320 11:45:03.414380 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cd4763a-4d0b-4052-a286-b5bfa32a2712-config-volume" (OuterVolumeSpecName: "config-volume") pod "7cd4763a-4d0b-4052-a286-b5bfa32a2712" (UID: "7cd4763a-4d0b-4052-a286-b5bfa32a2712"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:45:03 crc kubenswrapper[4860]: I0320 11:45:03.421624 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cd4763a-4d0b-4052-a286-b5bfa32a2712-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7cd4763a-4d0b-4052-a286-b5bfa32a2712" (UID: "7cd4763a-4d0b-4052-a286-b5bfa32a2712"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:45:03 crc kubenswrapper[4860]: I0320 11:45:03.422806 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cd4763a-4d0b-4052-a286-b5bfa32a2712-kube-api-access-nktf4" (OuterVolumeSpecName: "kube-api-access-nktf4") pod "7cd4763a-4d0b-4052-a286-b5bfa32a2712" (UID: "7cd4763a-4d0b-4052-a286-b5bfa32a2712"). InnerVolumeSpecName "kube-api-access-nktf4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:45:03 crc kubenswrapper[4860]: I0320 11:45:03.516499 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nktf4\" (UniqueName: \"kubernetes.io/projected/7cd4763a-4d0b-4052-a286-b5bfa32a2712-kube-api-access-nktf4\") on node \"crc\" DevicePath \"\"" Mar 20 11:45:03 crc kubenswrapper[4860]: I0320 11:45:03.516548 4860 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7cd4763a-4d0b-4052-a286-b5bfa32a2712-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 11:45:03 crc kubenswrapper[4860]: I0320 11:45:03.516561 4860 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7cd4763a-4d0b-4052-a286-b5bfa32a2712-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 11:45:03 crc kubenswrapper[4860]: I0320 11:45:03.997865 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-sxf4z" event={"ID":"7cd4763a-4d0b-4052-a286-b5bfa32a2712","Type":"ContainerDied","Data":"f104a3386c9ae8e14c7c7d1f332bbfa683c6e689a699938e2a44536f168b9259"} Mar 20 11:45:03 crc kubenswrapper[4860]: I0320 11:45:03.997940 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f104a3386c9ae8e14c7c7d1f332bbfa683c6e689a699938e2a44536f168b9259" Mar 20 11:45:03 crc kubenswrapper[4860]: I0320 11:45:03.997953 4860 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-sxf4z" Mar 20 11:45:04 crc kubenswrapper[4860]: I0320 11:45:04.347896 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566740-627pq"] Mar 20 11:45:04 crc kubenswrapper[4860]: I0320 11:45:04.353042 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566740-627pq"] Mar 20 11:45:05 crc kubenswrapper[4860]: I0320 11:45:05.424759 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41a09ead-8137-4791-896c-c5a9cad7f4cf" path="/var/lib/kubelet/pods/41a09ead-8137-4791-896c-c5a9cad7f4cf/volumes" Mar 20 11:45:09 crc kubenswrapper[4860]: I0320 11:45:09.413241 4860 scope.go:117] "RemoveContainer" containerID="2c6cd87f8a77136a1839b5f761bdf9d0b06d37fb07e23d5035e6ffed2dcf296d" Mar 20 11:45:09 crc kubenswrapper[4860]: E0320 11:45:09.414264 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:45:24 crc kubenswrapper[4860]: I0320 11:45:24.414107 4860 scope.go:117] "RemoveContainer" containerID="2c6cd87f8a77136a1839b5f761bdf9d0b06d37fb07e23d5035e6ffed2dcf296d" Mar 20 11:45:24 crc kubenswrapper[4860]: E0320 11:45:24.416444 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:45:39 crc kubenswrapper[4860]: I0320 11:45:39.415014 4860 scope.go:117] "RemoveContainer" containerID="2c6cd87f8a77136a1839b5f761bdf9d0b06d37fb07e23d5035e6ffed2dcf296d" Mar 20 11:45:39 crc kubenswrapper[4860]: E0320 11:45:39.416803 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:45:49 crc kubenswrapper[4860]: I0320 11:45:49.424944 4860 scope.go:117] "RemoveContainer" containerID="728de8ccc22f402da25ca09407c17b66749c7ba40a4b7eb4c5cb707fe2325a9c" Mar 20 11:45:50 crc kubenswrapper[4860]: I0320 11:45:50.414064 4860 scope.go:117] "RemoveContainer" containerID="2c6cd87f8a77136a1839b5f761bdf9d0b06d37fb07e23d5035e6ffed2dcf296d" Mar 20 11:45:50 crc kubenswrapper[4860]: E0320 11:45:50.414816 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:46:00 crc kubenswrapper[4860]: I0320 11:46:00.153727 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566786-6kgpn"] Mar 20 11:46:00 crc kubenswrapper[4860]: E0320 11:46:00.158118 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cd4763a-4d0b-4052-a286-b5bfa32a2712" 
containerName="collect-profiles" Mar 20 11:46:00 crc kubenswrapper[4860]: I0320 11:46:00.158185 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cd4763a-4d0b-4052-a286-b5bfa32a2712" containerName="collect-profiles" Mar 20 11:46:00 crc kubenswrapper[4860]: I0320 11:46:00.158524 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cd4763a-4d0b-4052-a286-b5bfa32a2712" containerName="collect-profiles" Mar 20 11:46:00 crc kubenswrapper[4860]: I0320 11:46:00.159332 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566786-6kgpn" Mar 20 11:46:00 crc kubenswrapper[4860]: I0320 11:46:00.162263 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-82p2f" Mar 20 11:46:00 crc kubenswrapper[4860]: I0320 11:46:00.162462 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:46:00 crc kubenswrapper[4860]: I0320 11:46:00.163883 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:46:00 crc kubenswrapper[4860]: I0320 11:46:00.164292 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566786-6kgpn"] Mar 20 11:46:00 crc kubenswrapper[4860]: I0320 11:46:00.318891 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jjbv\" (UniqueName: \"kubernetes.io/projected/a69dccb3-324d-47b3-92d0-af9fc224932d-kube-api-access-6jjbv\") pod \"auto-csr-approver-29566786-6kgpn\" (UID: \"a69dccb3-324d-47b3-92d0-af9fc224932d\") " pod="openshift-infra/auto-csr-approver-29566786-6kgpn" Mar 20 11:46:00 crc kubenswrapper[4860]: I0320 11:46:00.420916 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jjbv\" (UniqueName: 
\"kubernetes.io/projected/a69dccb3-324d-47b3-92d0-af9fc224932d-kube-api-access-6jjbv\") pod \"auto-csr-approver-29566786-6kgpn\" (UID: \"a69dccb3-324d-47b3-92d0-af9fc224932d\") " pod="openshift-infra/auto-csr-approver-29566786-6kgpn" Mar 20 11:46:00 crc kubenswrapper[4860]: I0320 11:46:00.448343 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jjbv\" (UniqueName: \"kubernetes.io/projected/a69dccb3-324d-47b3-92d0-af9fc224932d-kube-api-access-6jjbv\") pod \"auto-csr-approver-29566786-6kgpn\" (UID: \"a69dccb3-324d-47b3-92d0-af9fc224932d\") " pod="openshift-infra/auto-csr-approver-29566786-6kgpn" Mar 20 11:46:00 crc kubenswrapper[4860]: I0320 11:46:00.486501 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566786-6kgpn" Mar 20 11:46:00 crc kubenswrapper[4860]: I0320 11:46:00.921293 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566786-6kgpn"] Mar 20 11:46:00 crc kubenswrapper[4860]: I0320 11:46:00.928527 4860 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 11:46:01 crc kubenswrapper[4860]: I0320 11:46:01.474681 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566786-6kgpn" event={"ID":"a69dccb3-324d-47b3-92d0-af9fc224932d","Type":"ContainerStarted","Data":"1c358413d1a4a857bdf34b8955e68600db7d6770a01e0d6e798d3d144dbdea0b"} Mar 20 11:46:02 crc kubenswrapper[4860]: I0320 11:46:02.413925 4860 scope.go:117] "RemoveContainer" containerID="2c6cd87f8a77136a1839b5f761bdf9d0b06d37fb07e23d5035e6ffed2dcf296d" Mar 20 11:46:02 crc kubenswrapper[4860]: E0320 11:46:02.414864 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:46:02 crc kubenswrapper[4860]: I0320 11:46:02.485539 4860 generic.go:334] "Generic (PLEG): container finished" podID="a69dccb3-324d-47b3-92d0-af9fc224932d" containerID="02ca8def8758e2ad1b605230bcb844ea2d285141ff0c9b3e5a91bad1e50bf67e" exitCode=0 Mar 20 11:46:02 crc kubenswrapper[4860]: I0320 11:46:02.485630 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566786-6kgpn" event={"ID":"a69dccb3-324d-47b3-92d0-af9fc224932d","Type":"ContainerDied","Data":"02ca8def8758e2ad1b605230bcb844ea2d285141ff0c9b3e5a91bad1e50bf67e"} Mar 20 11:46:03 crc kubenswrapper[4860]: I0320 11:46:03.758547 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566786-6kgpn" Mar 20 11:46:03 crc kubenswrapper[4860]: I0320 11:46:03.879707 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jjbv\" (UniqueName: \"kubernetes.io/projected/a69dccb3-324d-47b3-92d0-af9fc224932d-kube-api-access-6jjbv\") pod \"a69dccb3-324d-47b3-92d0-af9fc224932d\" (UID: \"a69dccb3-324d-47b3-92d0-af9fc224932d\") " Mar 20 11:46:03 crc kubenswrapper[4860]: I0320 11:46:03.886148 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a69dccb3-324d-47b3-92d0-af9fc224932d-kube-api-access-6jjbv" (OuterVolumeSpecName: "kube-api-access-6jjbv") pod "a69dccb3-324d-47b3-92d0-af9fc224932d" (UID: "a69dccb3-324d-47b3-92d0-af9fc224932d"). InnerVolumeSpecName "kube-api-access-6jjbv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:46:03 crc kubenswrapper[4860]: I0320 11:46:03.981521 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jjbv\" (UniqueName: \"kubernetes.io/projected/a69dccb3-324d-47b3-92d0-af9fc224932d-kube-api-access-6jjbv\") on node \"crc\" DevicePath \"\"" Mar 20 11:46:04 crc kubenswrapper[4860]: I0320 11:46:04.501249 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566786-6kgpn" event={"ID":"a69dccb3-324d-47b3-92d0-af9fc224932d","Type":"ContainerDied","Data":"1c358413d1a4a857bdf34b8955e68600db7d6770a01e0d6e798d3d144dbdea0b"} Mar 20 11:46:04 crc kubenswrapper[4860]: I0320 11:46:04.501573 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c358413d1a4a857bdf34b8955e68600db7d6770a01e0d6e798d3d144dbdea0b" Mar 20 11:46:04 crc kubenswrapper[4860]: I0320 11:46:04.501347 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566786-6kgpn" Mar 20 11:46:04 crc kubenswrapper[4860]: I0320 11:46:04.843278 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566780-dpx82"] Mar 20 11:46:04 crc kubenswrapper[4860]: I0320 11:46:04.849317 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566780-dpx82"] Mar 20 11:46:05 crc kubenswrapper[4860]: I0320 11:46:05.426007 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f75e65d-773b-4474-985a-2ca6fea0dc6a" path="/var/lib/kubelet/pods/0f75e65d-773b-4474-985a-2ca6fea0dc6a/volumes" Mar 20 11:46:13 crc kubenswrapper[4860]: I0320 11:46:13.413802 4860 scope.go:117] "RemoveContainer" containerID="2c6cd87f8a77136a1839b5f761bdf9d0b06d37fb07e23d5035e6ffed2dcf296d" Mar 20 11:46:13 crc kubenswrapper[4860]: E0320 11:46:13.415178 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:46:28 crc kubenswrapper[4860]: I0320 11:46:28.413529 4860 scope.go:117] "RemoveContainer" containerID="2c6cd87f8a77136a1839b5f761bdf9d0b06d37fb07e23d5035e6ffed2dcf296d" Mar 20 11:46:28 crc kubenswrapper[4860]: E0320 11:46:28.416673 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:46:40 crc kubenswrapper[4860]: I0320 11:46:40.413880 4860 scope.go:117] "RemoveContainer" containerID="2c6cd87f8a77136a1839b5f761bdf9d0b06d37fb07e23d5035e6ffed2dcf296d" Mar 20 11:46:40 crc kubenswrapper[4860]: E0320 11:46:40.415096 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:46:49 crc kubenswrapper[4860]: I0320 11:46:49.489121 4860 scope.go:117] "RemoveContainer" containerID="7535e8513557739754c0e4889cb6b1e5e63acd730d71679b5f7aab2ccfb3bbbd" Mar 20 11:46:55 crc kubenswrapper[4860]: I0320 11:46:55.413961 4860 scope.go:117] "RemoveContainer" 
containerID="2c6cd87f8a77136a1839b5f761bdf9d0b06d37fb07e23d5035e6ffed2dcf296d" Mar 20 11:46:55 crc kubenswrapper[4860]: E0320 11:46:55.416496 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:47:02 crc kubenswrapper[4860]: I0320 11:47:02.335085 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-972zc/must-gather-g5zkg"] Mar 20 11:47:02 crc kubenswrapper[4860]: E0320 11:47:02.336034 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a69dccb3-324d-47b3-92d0-af9fc224932d" containerName="oc" Mar 20 11:47:02 crc kubenswrapper[4860]: I0320 11:47:02.336047 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="a69dccb3-324d-47b3-92d0-af9fc224932d" containerName="oc" Mar 20 11:47:02 crc kubenswrapper[4860]: I0320 11:47:02.336193 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="a69dccb3-324d-47b3-92d0-af9fc224932d" containerName="oc" Mar 20 11:47:02 crc kubenswrapper[4860]: I0320 11:47:02.337002 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-972zc/must-gather-g5zkg" Mar 20 11:47:02 crc kubenswrapper[4860]: I0320 11:47:02.343505 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-972zc"/"kube-root-ca.crt" Mar 20 11:47:02 crc kubenswrapper[4860]: I0320 11:47:02.345005 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-972zc"/"openshift-service-ca.crt" Mar 20 11:47:02 crc kubenswrapper[4860]: I0320 11:47:02.444009 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-972zc/must-gather-g5zkg"] Mar 20 11:47:02 crc kubenswrapper[4860]: I0320 11:47:02.450206 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbg2s\" (UniqueName: \"kubernetes.io/projected/537f47a7-01d4-449a-8afc-a83a212f4bc5-kube-api-access-dbg2s\") pod \"must-gather-g5zkg\" (UID: \"537f47a7-01d4-449a-8afc-a83a212f4bc5\") " pod="openshift-must-gather-972zc/must-gather-g5zkg" Mar 20 11:47:02 crc kubenswrapper[4860]: I0320 11:47:02.450433 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/537f47a7-01d4-449a-8afc-a83a212f4bc5-must-gather-output\") pod \"must-gather-g5zkg\" (UID: \"537f47a7-01d4-449a-8afc-a83a212f4bc5\") " pod="openshift-must-gather-972zc/must-gather-g5zkg" Mar 20 11:47:02 crc kubenswrapper[4860]: I0320 11:47:02.551980 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbg2s\" (UniqueName: \"kubernetes.io/projected/537f47a7-01d4-449a-8afc-a83a212f4bc5-kube-api-access-dbg2s\") pod \"must-gather-g5zkg\" (UID: \"537f47a7-01d4-449a-8afc-a83a212f4bc5\") " pod="openshift-must-gather-972zc/must-gather-g5zkg" Mar 20 11:47:02 crc kubenswrapper[4860]: I0320 11:47:02.552105 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/537f47a7-01d4-449a-8afc-a83a212f4bc5-must-gather-output\") pod \"must-gather-g5zkg\" (UID: \"537f47a7-01d4-449a-8afc-a83a212f4bc5\") " pod="openshift-must-gather-972zc/must-gather-g5zkg" Mar 20 11:47:02 crc kubenswrapper[4860]: I0320 11:47:02.552740 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/537f47a7-01d4-449a-8afc-a83a212f4bc5-must-gather-output\") pod \"must-gather-g5zkg\" (UID: \"537f47a7-01d4-449a-8afc-a83a212f4bc5\") " pod="openshift-must-gather-972zc/must-gather-g5zkg" Mar 20 11:47:02 crc kubenswrapper[4860]: I0320 11:47:02.573558 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbg2s\" (UniqueName: \"kubernetes.io/projected/537f47a7-01d4-449a-8afc-a83a212f4bc5-kube-api-access-dbg2s\") pod \"must-gather-g5zkg\" (UID: \"537f47a7-01d4-449a-8afc-a83a212f4bc5\") " pod="openshift-must-gather-972zc/must-gather-g5zkg" Mar 20 11:47:02 crc kubenswrapper[4860]: I0320 11:47:02.658039 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-972zc/must-gather-g5zkg" Mar 20 11:47:02 crc kubenswrapper[4860]: I0320 11:47:02.965741 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-972zc/must-gather-g5zkg"] Mar 20 11:47:02 crc kubenswrapper[4860]: W0320 11:47:02.980731 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod537f47a7_01d4_449a_8afc_a83a212f4bc5.slice/crio-cf9211712c83fdbaf339ec06742e20128d2665dc69097adadd2b0df6ade5b939 WatchSource:0}: Error finding container cf9211712c83fdbaf339ec06742e20128d2665dc69097adadd2b0df6ade5b939: Status 404 returned error can't find the container with id cf9211712c83fdbaf339ec06742e20128d2665dc69097adadd2b0df6ade5b939 Mar 20 11:47:03 crc kubenswrapper[4860]: I0320 11:47:03.989249 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-972zc/must-gather-g5zkg" event={"ID":"537f47a7-01d4-449a-8afc-a83a212f4bc5","Type":"ContainerStarted","Data":"cf9211712c83fdbaf339ec06742e20128d2665dc69097adadd2b0df6ade5b939"} Mar 20 11:47:07 crc kubenswrapper[4860]: I0320 11:47:07.419388 4860 scope.go:117] "RemoveContainer" containerID="2c6cd87f8a77136a1839b5f761bdf9d0b06d37fb07e23d5035e6ffed2dcf296d" Mar 20 11:47:07 crc kubenswrapper[4860]: E0320 11:47:07.420705 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:47:11 crc kubenswrapper[4860]: I0320 11:47:11.072792 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-972zc/must-gather-g5zkg" 
event={"ID":"537f47a7-01d4-449a-8afc-a83a212f4bc5","Type":"ContainerStarted","Data":"e02bf15081b9372bf7659815dbcb88502421613acc550737bba159008d91b9ba"} Mar 20 11:47:11 crc kubenswrapper[4860]: I0320 11:47:11.073716 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-972zc/must-gather-g5zkg" event={"ID":"537f47a7-01d4-449a-8afc-a83a212f4bc5","Type":"ContainerStarted","Data":"fea1bff12d057f80685637551a80eabc97dbecd983b9b59ee632515897ce584b"} Mar 20 11:47:12 crc kubenswrapper[4860]: I0320 11:47:12.099387 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-972zc/must-gather-g5zkg" podStartSLOduration=2.451706799 podStartE2EDuration="10.099363633s" podCreationTimestamp="2026-03-20 11:47:02 +0000 UTC" firstStartedPulling="2026-03-20 11:47:02.982961748 +0000 UTC m=+3147.204322646" lastFinishedPulling="2026-03-20 11:47:10.630618582 +0000 UTC m=+3154.851979480" observedRunningTime="2026-03-20 11:47:12.095743475 +0000 UTC m=+3156.317104383" watchObservedRunningTime="2026-03-20 11:47:12.099363633 +0000 UTC m=+3156.320724521" Mar 20 11:47:18 crc kubenswrapper[4860]: I0320 11:47:18.414272 4860 scope.go:117] "RemoveContainer" containerID="2c6cd87f8a77136a1839b5f761bdf9d0b06d37fb07e23d5035e6ffed2dcf296d" Mar 20 11:47:18 crc kubenswrapper[4860]: E0320 11:47:18.415434 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:47:33 crc kubenswrapper[4860]: I0320 11:47:33.414249 4860 scope.go:117] "RemoveContainer" containerID="2c6cd87f8a77136a1839b5f761bdf9d0b06d37fb07e23d5035e6ffed2dcf296d" Mar 20 11:47:33 crc kubenswrapper[4860]: 
E0320 11:47:33.415591 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:47:48 crc kubenswrapper[4860]: I0320 11:47:48.413582 4860 scope.go:117] "RemoveContainer" containerID="2c6cd87f8a77136a1839b5f761bdf9d0b06d37fb07e23d5035e6ffed2dcf296d" Mar 20 11:47:48 crc kubenswrapper[4860]: E0320 11:47:48.414732 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:48:00 crc kubenswrapper[4860]: I0320 11:48:00.120820 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78dd6ddcc-sjw98_ccb7e541-f715-4030-8091-91f7e9eacb4c/init/0.log" Mar 20 11:48:00 crc kubenswrapper[4860]: I0320 11:48:00.154317 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566788-dlb4m"] Mar 20 11:48:00 crc kubenswrapper[4860]: I0320 11:48:00.155714 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566788-dlb4m" Mar 20 11:48:00 crc kubenswrapper[4860]: I0320 11:48:00.160282 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:48:00 crc kubenswrapper[4860]: I0320 11:48:00.160498 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:48:00 crc kubenswrapper[4860]: I0320 11:48:00.161093 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-82p2f" Mar 20 11:48:00 crc kubenswrapper[4860]: I0320 11:48:00.173597 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566788-dlb4m"] Mar 20 11:48:00 crc kubenswrapper[4860]: I0320 11:48:00.262582 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7j5jj\" (UniqueName: \"kubernetes.io/projected/0f342006-66c4-4bc6-9577-1aa4db4b4210-kube-api-access-7j5jj\") pod \"auto-csr-approver-29566788-dlb4m\" (UID: \"0f342006-66c4-4bc6-9577-1aa4db4b4210\") " pod="openshift-infra/auto-csr-approver-29566788-dlb4m" Mar 20 11:48:00 crc kubenswrapper[4860]: I0320 11:48:00.359285 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78dd6ddcc-sjw98_ccb7e541-f715-4030-8091-91f7e9eacb4c/init/0.log" Mar 20 11:48:00 crc kubenswrapper[4860]: I0320 11:48:00.363530 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78dd6ddcc-sjw98_ccb7e541-f715-4030-8091-91f7e9eacb4c/dnsmasq-dns/0.log" Mar 20 11:48:00 crc kubenswrapper[4860]: I0320 11:48:00.364750 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7j5jj\" (UniqueName: \"kubernetes.io/projected/0f342006-66c4-4bc6-9577-1aa4db4b4210-kube-api-access-7j5jj\") pod \"auto-csr-approver-29566788-dlb4m\" (UID: 
\"0f342006-66c4-4bc6-9577-1aa4db4b4210\") " pod="openshift-infra/auto-csr-approver-29566788-dlb4m" Mar 20 11:48:00 crc kubenswrapper[4860]: I0320 11:48:00.392825 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7j5jj\" (UniqueName: \"kubernetes.io/projected/0f342006-66c4-4bc6-9577-1aa4db4b4210-kube-api-access-7j5jj\") pod \"auto-csr-approver-29566788-dlb4m\" (UID: \"0f342006-66c4-4bc6-9577-1aa4db4b4210\") " pod="openshift-infra/auto-csr-approver-29566788-dlb4m" Mar 20 11:48:00 crc kubenswrapper[4860]: I0320 11:48:00.482044 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566788-dlb4m" Mar 20 11:48:00 crc kubenswrapper[4860]: I0320 11:48:00.775210 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566788-dlb4m"] Mar 20 11:48:00 crc kubenswrapper[4860]: W0320 11:48:00.786204 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f342006_66c4_4bc6_9577_1aa4db4b4210.slice/crio-57ac84480514a47c7671f6d10def87abaaf2b0b0003264f548db5f1f5c7cb2df WatchSource:0}: Error finding container 57ac84480514a47c7671f6d10def87abaaf2b0b0003264f548db5f1f5c7cb2df: Status 404 returned error can't find the container with id 57ac84480514a47c7671f6d10def87abaaf2b0b0003264f548db5f1f5c7cb2df Mar 20 11:48:01 crc kubenswrapper[4860]: I0320 11:48:01.656528 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566788-dlb4m" event={"ID":"0f342006-66c4-4bc6-9577-1aa4db4b4210","Type":"ContainerStarted","Data":"57ac84480514a47c7671f6d10def87abaaf2b0b0003264f548db5f1f5c7cb2df"} Mar 20 11:48:02 crc kubenswrapper[4860]: I0320 11:48:02.666387 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566788-dlb4m" 
event={"ID":"0f342006-66c4-4bc6-9577-1aa4db4b4210","Type":"ContainerStarted","Data":"e6d46e2bb0f38724bb3cbbcdce2b9456f7002fbe4fd812111c54f54bb22a93b2"} Mar 20 11:48:02 crc kubenswrapper[4860]: I0320 11:48:02.696777 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566788-dlb4m" podStartSLOduration=1.64140723 podStartE2EDuration="2.696753959s" podCreationTimestamp="2026-03-20 11:48:00 +0000 UTC" firstStartedPulling="2026-03-20 11:48:00.789023376 +0000 UTC m=+3205.010384274" lastFinishedPulling="2026-03-20 11:48:01.844370105 +0000 UTC m=+3206.065731003" observedRunningTime="2026-03-20 11:48:02.687290304 +0000 UTC m=+3206.908651202" watchObservedRunningTime="2026-03-20 11:48:02.696753959 +0000 UTC m=+3206.918114857" Mar 20 11:48:03 crc kubenswrapper[4860]: I0320 11:48:03.414063 4860 scope.go:117] "RemoveContainer" containerID="2c6cd87f8a77136a1839b5f761bdf9d0b06d37fb07e23d5035e6ffed2dcf296d" Mar 20 11:48:03 crc kubenswrapper[4860]: E0320 11:48:03.414321 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:48:03 crc kubenswrapper[4860]: I0320 11:48:03.676723 4860 generic.go:334] "Generic (PLEG): container finished" podID="0f342006-66c4-4bc6-9577-1aa4db4b4210" containerID="e6d46e2bb0f38724bb3cbbcdce2b9456f7002fbe4fd812111c54f54bb22a93b2" exitCode=0 Mar 20 11:48:03 crc kubenswrapper[4860]: I0320 11:48:03.676791 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566788-dlb4m" 
event={"ID":"0f342006-66c4-4bc6-9577-1aa4db4b4210","Type":"ContainerDied","Data":"e6d46e2bb0f38724bb3cbbcdce2b9456f7002fbe4fd812111c54f54bb22a93b2"} Mar 20 11:48:04 crc kubenswrapper[4860]: I0320 11:48:04.981273 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566788-dlb4m" Mar 20 11:48:05 crc kubenswrapper[4860]: I0320 11:48:05.037922 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7j5jj\" (UniqueName: \"kubernetes.io/projected/0f342006-66c4-4bc6-9577-1aa4db4b4210-kube-api-access-7j5jj\") pod \"0f342006-66c4-4bc6-9577-1aa4db4b4210\" (UID: \"0f342006-66c4-4bc6-9577-1aa4db4b4210\") " Mar 20 11:48:05 crc kubenswrapper[4860]: I0320 11:48:05.046213 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f342006-66c4-4bc6-9577-1aa4db4b4210-kube-api-access-7j5jj" (OuterVolumeSpecName: "kube-api-access-7j5jj") pod "0f342006-66c4-4bc6-9577-1aa4db4b4210" (UID: "0f342006-66c4-4bc6-9577-1aa4db4b4210"). InnerVolumeSpecName "kube-api-access-7j5jj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:48:05 crc kubenswrapper[4860]: I0320 11:48:05.140357 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7j5jj\" (UniqueName: \"kubernetes.io/projected/0f342006-66c4-4bc6-9577-1aa4db4b4210-kube-api-access-7j5jj\") on node \"crc\" DevicePath \"\"" Mar 20 11:48:05 crc kubenswrapper[4860]: I0320 11:48:05.694425 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566788-dlb4m" event={"ID":"0f342006-66c4-4bc6-9577-1aa4db4b4210","Type":"ContainerDied","Data":"57ac84480514a47c7671f6d10def87abaaf2b0b0003264f548db5f1f5c7cb2df"} Mar 20 11:48:05 crc kubenswrapper[4860]: I0320 11:48:05.694494 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="57ac84480514a47c7671f6d10def87abaaf2b0b0003264f548db5f1f5c7cb2df" Mar 20 11:48:05 crc kubenswrapper[4860]: I0320 11:48:05.694518 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566788-dlb4m" Mar 20 11:48:05 crc kubenswrapper[4860]: I0320 11:48:05.781745 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566782-lzh8h"] Mar 20 11:48:05 crc kubenswrapper[4860]: I0320 11:48:05.788660 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566782-lzh8h"] Mar 20 11:48:07 crc kubenswrapper[4860]: I0320 11:48:07.424030 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfc74c40-1bf8-47fb-91a4-a6e27724dff9" path="/var/lib/kubelet/pods/cfc74c40-1bf8-47fb-91a4-a6e27724dff9/volumes" Mar 20 11:48:15 crc kubenswrapper[4860]: I0320 11:48:15.246028 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858pzhxp_c6145c03-dfdd-4224-b2b0-6087b1f137d1/util/0.log" Mar 20 11:48:15 crc kubenswrapper[4860]: I0320 11:48:15.465986 
4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858pzhxp_c6145c03-dfdd-4224-b2b0-6087b1f137d1/util/0.log" Mar 20 11:48:15 crc kubenswrapper[4860]: I0320 11:48:15.467568 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858pzhxp_c6145c03-dfdd-4224-b2b0-6087b1f137d1/pull/0.log" Mar 20 11:48:15 crc kubenswrapper[4860]: I0320 11:48:15.469078 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858pzhxp_c6145c03-dfdd-4224-b2b0-6087b1f137d1/pull/0.log" Mar 20 11:48:15 crc kubenswrapper[4860]: I0320 11:48:15.715135 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858pzhxp_c6145c03-dfdd-4224-b2b0-6087b1f137d1/pull/0.log" Mar 20 11:48:15 crc kubenswrapper[4860]: I0320 11:48:15.719727 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858pzhxp_c6145c03-dfdd-4224-b2b0-6087b1f137d1/util/0.log" Mar 20 11:48:15 crc kubenswrapper[4860]: I0320 11:48:15.720411 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858pzhxp_c6145c03-dfdd-4224-b2b0-6087b1f137d1/extract/0.log" Mar 20 11:48:15 crc kubenswrapper[4860]: I0320 11:48:15.905745 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-59bc569d95-8dh72_8b4d2530-4f67-45e8-9444-bea25fdad6ae/manager/0.log" Mar 20 11:48:16 crc kubenswrapper[4860]: I0320 11:48:16.167336 4860 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_designate-operator-controller-manager-588d4d986b-2692b_20d35dc6-0fc2-4651-9dcd-855814132a5f/manager/0.log" Mar 20 11:48:16 crc kubenswrapper[4860]: I0320 11:48:16.310642 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-79df6bcc97-zphz9_5fdbc315-f7fd-47ac-aa39-fdbe068f6f3b/manager/0.log" Mar 20 11:48:16 crc kubenswrapper[4860]: I0320 11:48:16.413259 4860 scope.go:117] "RemoveContainer" containerID="2c6cd87f8a77136a1839b5f761bdf9d0b06d37fb07e23d5035e6ffed2dcf296d" Mar 20 11:48:16 crc kubenswrapper[4860]: E0320 11:48:16.413556 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:48:16 crc kubenswrapper[4860]: I0320 11:48:16.508173 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-67dd5f86f5-vw2d9_36138670-7449-4d49-8a23-73b57d10b67f/manager/0.log" Mar 20 11:48:16 crc kubenswrapper[4860]: I0320 11:48:16.568026 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d58dc466-s2kwq_178fff2d-699c-4cab-8626-3e30a6bd9ed6/manager/0.log" Mar 20 11:48:16 crc kubenswrapper[4860]: I0320 11:48:16.683605 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-8464cc45fb-wfczk_c54f27c4-bd61-4bad-bf91-376fee65d219/manager/0.log" Mar 20 11:48:16 crc kubenswrapper[4860]: I0320 11:48:16.833873 4860 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_infra-operator-controller-manager-669fff9c7c-njzqs_70703379-8eb2-4f8a-95c8-302b53692a53/manager/0.log" Mar 20 11:48:16 crc kubenswrapper[4860]: I0320 11:48:16.900333 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f787dddc9-mc48w_acf57205-3b95-48a3-8222-1b57b0b6c54b/manager/0.log" Mar 20 11:48:17 crc kubenswrapper[4860]: I0320 11:48:17.051262 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-768b96df4c-pq75b_fbbe8243-9afb-4fc5-90f1-04d6f0c074ef/manager/0.log" Mar 20 11:48:17 crc kubenswrapper[4860]: I0320 11:48:17.120420 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-55f864c847-pzk5m_0fe9b978-da91-4568-9b77-0d5930aca888/manager/0.log" Mar 20 11:48:17 crc kubenswrapper[4860]: I0320 11:48:17.272917 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67ccfc9778-m8948_d7202366-6dc1-45ca-bb9a-74bdd0426c5f/manager/0.log" Mar 20 11:48:17 crc kubenswrapper[4860]: I0320 11:48:17.327064 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-767865f676-2vsjq_29801d0c-963e-4b38-ad2d-8b03d3ade0be/manager/0.log" Mar 20 11:48:17 crc kubenswrapper[4860]: I0320 11:48:17.526524 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5d488d59fb-z8fp5_6c2530cf-70b4-4a89-acff-086b36773edf/manager/0.log" Mar 20 11:48:17 crc kubenswrapper[4860]: I0320 11:48:17.581518 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5b9f45d989-tjt52_431ab970-7f36-4ace-860c-479faac092a0/manager/0.log" Mar 20 11:48:17 crc kubenswrapper[4860]: I0320 11:48:17.784440 4860 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-89d64c458-5tqgx_ecf64e38-138d-4ef7-8b17-c09f30358f3e/manager/0.log" Mar 20 11:48:17 crc kubenswrapper[4860]: I0320 11:48:17.982948 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-846ffbb776-dppd5_31f3fcff-ca2c-40b5-bdf3-018132ccb63b/operator/0.log" Mar 20 11:48:18 crc kubenswrapper[4860]: I0320 11:48:18.089759 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6697dffbc-hpk42_84431296-0ca0-425a-8da8-c3ea46b08b29/manager/0.log" Mar 20 11:48:18 crc kubenswrapper[4860]: I0320 11:48:18.238193 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-82h6r_f7193309-39f9-4487-b02b-8e9e4d6a69ff/registry-server/0.log" Mar 20 11:48:18 crc kubenswrapper[4860]: I0320 11:48:18.308408 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-884679f54-4nk5c_c736e6d7-6806-4ef3-a0b3-f1b17ab33037/manager/0.log" Mar 20 11:48:18 crc kubenswrapper[4860]: I0320 11:48:18.454209 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5784578c99-4tdg4_7f73053a-86aa-42dc-bcca-ee26a4fda2e5/manager/0.log" Mar 20 11:48:18 crc kubenswrapper[4860]: I0320 11:48:18.515008 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-c674c5965-dvptb_cce5926a-9df6-4915-a94f-02cf2f74fccc/manager/0.log" Mar 20 11:48:18 crc kubenswrapper[4860]: I0320 11:48:18.689109 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-d6b694c5-jd9bn_b5e881e2-f657-418f-ba87-7074722307a2/manager/0.log" Mar 20 11:48:18 crc kubenswrapper[4860]: I0320 11:48:18.752308 4860 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-b4zcf_f329ab6d-5c8c-4ed2-a830-d0a04bb31071/manager/0.log" Mar 20 11:48:18 crc kubenswrapper[4860]: I0320 11:48:18.922916 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c4d75f7f9-ncmzn_1723efcf-97d7-4101-a15d-d4776d45d29b/manager/0.log" Mar 20 11:48:28 crc kubenswrapper[4860]: I0320 11:48:28.414042 4860 scope.go:117] "RemoveContainer" containerID="2c6cd87f8a77136a1839b5f761bdf9d0b06d37fb07e23d5035e6ffed2dcf296d" Mar 20 11:48:28 crc kubenswrapper[4860]: E0320 11:48:28.415147 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:48:38 crc kubenswrapper[4860]: I0320 11:48:38.572763 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-jnqsw_8bc351c5-b724-443e-a7e2-f4abba352cef/control-plane-machine-set-operator/0.log" Mar 20 11:48:38 crc kubenswrapper[4860]: I0320 11:48:38.745972 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-s52jd_d4ce1856-395a-4003-9642-61da7cbdd789/kube-rbac-proxy/0.log" Mar 20 11:48:38 crc kubenswrapper[4860]: I0320 11:48:38.805209 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-s52jd_d4ce1856-395a-4003-9642-61da7cbdd789/machine-api-operator/0.log" Mar 20 11:48:41 crc kubenswrapper[4860]: I0320 11:48:41.417620 4860 scope.go:117] "RemoveContainer" 
containerID="2c6cd87f8a77136a1839b5f761bdf9d0b06d37fb07e23d5035e6ffed2dcf296d" Mar 20 11:48:41 crc kubenswrapper[4860]: E0320 11:48:41.418380 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:48:49 crc kubenswrapper[4860]: I0320 11:48:49.594173 4860 scope.go:117] "RemoveContainer" containerID="9f2e26a1e8f88c5ac76c8ad0792718b788b15449739711bc2614bdd4cd541855" Mar 20 11:48:51 crc kubenswrapper[4860]: I0320 11:48:51.535498 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-545d4d4674-x6lwp_74ed19c1-0e46-4fed-b50f-155eaa38aed9/cert-manager-controller/0.log" Mar 20 11:48:51 crc kubenswrapper[4860]: I0320 11:48:51.734649 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-5545bd876-sjz7s_aa7d8eaa-20ac-4ea3-b19d-e8f89054c619/cert-manager-cainjector/0.log" Mar 20 11:48:51 crc kubenswrapper[4860]: I0320 11:48:51.861391 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-6888856db4-skhrl_4587c778-c12c-48e0-8c28-7eb7a7c1b722/cert-manager-webhook/0.log" Mar 20 11:48:53 crc kubenswrapper[4860]: I0320 11:48:53.414032 4860 scope.go:117] "RemoveContainer" containerID="2c6cd87f8a77136a1839b5f761bdf9d0b06d37fb07e23d5035e6ffed2dcf296d" Mar 20 11:48:54 crc kubenswrapper[4860]: I0320 11:48:54.071665 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" event={"ID":"6a9df230-75a1-4b64-8d00-c179e9c19080","Type":"ContainerStarted","Data":"27de1546d8fc12cd142537f3e435648e650a77886fa9d1677aedbb1fcd90c4bd"} 
Mar 20 11:49:04 crc kubenswrapper[4860]: I0320 11:49:04.685248 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-l98lx_a91c6f2b-7646-4f4d-bdc2-47304e36da4e/nmstate-console-plugin/0.log" Mar 20 11:49:04 crc kubenswrapper[4860]: I0320 11:49:04.877568 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-mdh82_ef7f3b63-3a7d-483b-95c1-32961dad6226/nmstate-handler/0.log" Mar 20 11:49:05 crc kubenswrapper[4860]: I0320 11:49:05.015615 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-wr9vc_6f56c0b5-3d27-49e6-af5b-6ad929d9e857/kube-rbac-proxy/0.log" Mar 20 11:49:05 crc kubenswrapper[4860]: I0320 11:49:05.043542 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-wr9vc_6f56c0b5-3d27-49e6-af5b-6ad929d9e857/nmstate-metrics/0.log" Mar 20 11:49:05 crc kubenswrapper[4860]: I0320 11:49:05.154472 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-dmczs_ce7d9f29-28cd-4038-b492-b18e0b129907/nmstate-operator/0.log" Mar 20 11:49:05 crc kubenswrapper[4860]: I0320 11:49:05.260642 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-9cfpv_db5d41a4-2808-4189-8c3e-e0730cdf1a4f/nmstate-webhook/0.log" Mar 20 11:49:32 crc kubenswrapper[4860]: I0320 11:49:32.220962 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-tdcbt_a2a3b82e-416b-4757-8719-97c58493428e/kube-rbac-proxy/0.log" Mar 20 11:49:32 crc kubenswrapper[4860]: I0320 11:49:32.303506 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-tdcbt_a2a3b82e-416b-4757-8719-97c58493428e/controller/0.log" Mar 20 11:49:32 crc kubenswrapper[4860]: I0320 11:49:32.454444 4860 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-mzzpz_dcd69c7f-fded-4b09-bd44-607b27716196/cp-frr-files/0.log" Mar 20 11:49:32 crc kubenswrapper[4860]: I0320 11:49:32.694458 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mzzpz_dcd69c7f-fded-4b09-bd44-607b27716196/cp-frr-files/0.log" Mar 20 11:49:32 crc kubenswrapper[4860]: I0320 11:49:32.694581 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mzzpz_dcd69c7f-fded-4b09-bd44-607b27716196/cp-metrics/0.log" Mar 20 11:49:32 crc kubenswrapper[4860]: I0320 11:49:32.707799 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mzzpz_dcd69c7f-fded-4b09-bd44-607b27716196/cp-reloader/0.log" Mar 20 11:49:32 crc kubenswrapper[4860]: I0320 11:49:32.709526 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mzzpz_dcd69c7f-fded-4b09-bd44-607b27716196/cp-reloader/0.log" Mar 20 11:49:32 crc kubenswrapper[4860]: I0320 11:49:32.962546 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mzzpz_dcd69c7f-fded-4b09-bd44-607b27716196/cp-frr-files/0.log" Mar 20 11:49:32 crc kubenswrapper[4860]: I0320 11:49:32.966434 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mzzpz_dcd69c7f-fded-4b09-bd44-607b27716196/cp-reloader/0.log" Mar 20 11:49:32 crc kubenswrapper[4860]: I0320 11:49:32.973038 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mzzpz_dcd69c7f-fded-4b09-bd44-607b27716196/cp-metrics/0.log" Mar 20 11:49:33 crc kubenswrapper[4860]: I0320 11:49:33.019937 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mzzpz_dcd69c7f-fded-4b09-bd44-607b27716196/cp-metrics/0.log" Mar 20 11:49:33 crc kubenswrapper[4860]: I0320 11:49:33.208540 4860 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-mzzpz_dcd69c7f-fded-4b09-bd44-607b27716196/cp-frr-files/0.log" Mar 20 11:49:33 crc kubenswrapper[4860]: I0320 11:49:33.220687 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mzzpz_dcd69c7f-fded-4b09-bd44-607b27716196/cp-metrics/0.log" Mar 20 11:49:33 crc kubenswrapper[4860]: I0320 11:49:33.224961 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mzzpz_dcd69c7f-fded-4b09-bd44-607b27716196/cp-reloader/0.log" Mar 20 11:49:33 crc kubenswrapper[4860]: I0320 11:49:33.251541 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mzzpz_dcd69c7f-fded-4b09-bd44-607b27716196/controller/0.log" Mar 20 11:49:33 crc kubenswrapper[4860]: I0320 11:49:33.417208 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mzzpz_dcd69c7f-fded-4b09-bd44-607b27716196/kube-rbac-proxy/0.log" Mar 20 11:49:33 crc kubenswrapper[4860]: I0320 11:49:33.437463 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mzzpz_dcd69c7f-fded-4b09-bd44-607b27716196/frr-metrics/0.log" Mar 20 11:49:33 crc kubenswrapper[4860]: I0320 11:49:33.508136 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mzzpz_dcd69c7f-fded-4b09-bd44-607b27716196/kube-rbac-proxy-frr/0.log" Mar 20 11:49:33 crc kubenswrapper[4860]: I0320 11:49:33.656398 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mzzpz_dcd69c7f-fded-4b09-bd44-607b27716196/reloader/0.log" Mar 20 11:49:33 crc kubenswrapper[4860]: I0320 11:49:33.789296 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-jhncx_3c1b54d0-fcfb-451b-ae3d-b731d3f9f6de/frr-k8s-webhook-server/0.log" Mar 20 11:49:33 crc kubenswrapper[4860]: I0320 11:49:33.927248 4860 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-mzzpz_dcd69c7f-fded-4b09-bd44-607b27716196/frr/0.log" Mar 20 11:49:33 crc kubenswrapper[4860]: I0320 11:49:33.933441 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5c589f6ccd-bcpmf_bb8f951b-6aa9-420c-9bad-dfa857482d4c/manager/0.log" Mar 20 11:49:34 crc kubenswrapper[4860]: I0320 11:49:34.097010 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-587dc5fb9c-2t48s_1eb0189c-2177-4c4e-83f6-7ba051322847/webhook-server/0.log" Mar 20 11:49:34 crc kubenswrapper[4860]: I0320 11:49:34.126554 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-brjk7_6ee4e9c2-66c1-4431-bde4-29d09a044a32/kube-rbac-proxy/0.log" Mar 20 11:49:34 crc kubenswrapper[4860]: I0320 11:49:34.398302 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-brjk7_6ee4e9c2-66c1-4431-bde4-29d09a044a32/speaker/0.log" Mar 20 11:49:48 crc kubenswrapper[4860]: I0320 11:49:48.737695 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746cdjd_eb4de49a-fca6-4c8c-8484-461859f95884/util/0.log" Mar 20 11:49:49 crc kubenswrapper[4860]: I0320 11:49:49.005382 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746cdjd_eb4de49a-fca6-4c8c-8484-461859f95884/util/0.log" Mar 20 11:49:49 crc kubenswrapper[4860]: I0320 11:49:49.039258 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746cdjd_eb4de49a-fca6-4c8c-8484-461859f95884/pull/0.log" Mar 20 11:49:49 crc kubenswrapper[4860]: I0320 11:49:49.063122 4860 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746cdjd_eb4de49a-fca6-4c8c-8484-461859f95884/pull/0.log" Mar 20 11:49:49 crc kubenswrapper[4860]: I0320 11:49:49.274745 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746cdjd_eb4de49a-fca6-4c8c-8484-461859f95884/util/0.log" Mar 20 11:49:49 crc kubenswrapper[4860]: I0320 11:49:49.277482 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746cdjd_eb4de49a-fca6-4c8c-8484-461859f95884/extract/0.log" Mar 20 11:49:49 crc kubenswrapper[4860]: I0320 11:49:49.313786 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746cdjd_eb4de49a-fca6-4c8c-8484-461859f95884/pull/0.log" Mar 20 11:49:49 crc kubenswrapper[4860]: I0320 11:49:49.486553 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sgngn_7da0294e-5ac5-4655-b882-cfd1f36ce791/util/0.log" Mar 20 11:49:49 crc kubenswrapper[4860]: I0320 11:49:49.655100 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sgngn_7da0294e-5ac5-4655-b882-cfd1f36ce791/pull/0.log" Mar 20 11:49:49 crc kubenswrapper[4860]: I0320 11:49:49.695833 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sgngn_7da0294e-5ac5-4655-b882-cfd1f36ce791/util/0.log" Mar 20 11:49:49 crc kubenswrapper[4860]: I0320 11:49:49.720266 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sgngn_7da0294e-5ac5-4655-b882-cfd1f36ce791/pull/0.log" Mar 20 
11:49:50 crc kubenswrapper[4860]: I0320 11:49:50.094702 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sgngn_7da0294e-5ac5-4655-b882-cfd1f36ce791/extract/0.log" Mar 20 11:49:50 crc kubenswrapper[4860]: I0320 11:49:50.128712 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sgngn_7da0294e-5ac5-4655-b882-cfd1f36ce791/util/0.log" Mar 20 11:49:50 crc kubenswrapper[4860]: I0320 11:49:50.135963 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sgngn_7da0294e-5ac5-4655-b882-cfd1f36ce791/pull/0.log" Mar 20 11:49:50 crc kubenswrapper[4860]: I0320 11:49:50.299795 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5km9m2_ea4498dd-681d-4260-b895-06e53dbcc9b9/util/0.log" Mar 20 11:49:50 crc kubenswrapper[4860]: I0320 11:49:50.482549 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5km9m2_ea4498dd-681d-4260-b895-06e53dbcc9b9/util/0.log" Mar 20 11:49:50 crc kubenswrapper[4860]: I0320 11:49:50.543083 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5km9m2_ea4498dd-681d-4260-b895-06e53dbcc9b9/pull/0.log" Mar 20 11:49:50 crc kubenswrapper[4860]: I0320 11:49:50.575827 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5km9m2_ea4498dd-681d-4260-b895-06e53dbcc9b9/pull/0.log" Mar 20 11:49:50 crc kubenswrapper[4860]: I0320 11:49:50.794127 4860 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5km9m2_ea4498dd-681d-4260-b895-06e53dbcc9b9/util/0.log" Mar 20 11:49:50 crc kubenswrapper[4860]: I0320 11:49:50.822618 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5km9m2_ea4498dd-681d-4260-b895-06e53dbcc9b9/pull/0.log" Mar 20 11:49:50 crc kubenswrapper[4860]: I0320 11:49:50.840254 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5km9m2_ea4498dd-681d-4260-b895-06e53dbcc9b9/extract/0.log" Mar 20 11:49:51 crc kubenswrapper[4860]: I0320 11:49:51.000175 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-47qz8_a3a77828-39d7-4547-ba09-26a9a0fb8e7b/extract-utilities/0.log" Mar 20 11:49:51 crc kubenswrapper[4860]: I0320 11:49:51.220456 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-47qz8_a3a77828-39d7-4547-ba09-26a9a0fb8e7b/extract-content/0.log" Mar 20 11:49:51 crc kubenswrapper[4860]: I0320 11:49:51.220689 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-47qz8_a3a77828-39d7-4547-ba09-26a9a0fb8e7b/extract-content/0.log" Mar 20 11:49:51 crc kubenswrapper[4860]: I0320 11:49:51.240104 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-47qz8_a3a77828-39d7-4547-ba09-26a9a0fb8e7b/extract-utilities/0.log" Mar 20 11:49:51 crc kubenswrapper[4860]: I0320 11:49:51.424882 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-47qz8_a3a77828-39d7-4547-ba09-26a9a0fb8e7b/extract-utilities/0.log" Mar 20 11:49:51 crc kubenswrapper[4860]: I0320 11:49:51.507581 4860 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-47qz8_a3a77828-39d7-4547-ba09-26a9a0fb8e7b/extract-content/0.log" Mar 20 11:49:51 crc kubenswrapper[4860]: I0320 11:49:51.717304 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nvhg7_4ab38144-c30d-4aed-884c-8ace682fe5ea/extract-utilities/0.log" Mar 20 11:49:51 crc kubenswrapper[4860]: I0320 11:49:51.967472 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nvhg7_4ab38144-c30d-4aed-884c-8ace682fe5ea/extract-content/0.log" Mar 20 11:49:51 crc kubenswrapper[4860]: I0320 11:49:51.978182 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nvhg7_4ab38144-c30d-4aed-884c-8ace682fe5ea/extract-utilities/0.log" Mar 20 11:49:52 crc kubenswrapper[4860]: I0320 11:49:52.045991 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nvhg7_4ab38144-c30d-4aed-884c-8ace682fe5ea/extract-content/0.log" Mar 20 11:49:52 crc kubenswrapper[4860]: I0320 11:49:52.073123 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-47qz8_a3a77828-39d7-4547-ba09-26a9a0fb8e7b/registry-server/0.log" Mar 20 11:49:52 crc kubenswrapper[4860]: I0320 11:49:52.258320 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nvhg7_4ab38144-c30d-4aed-884c-8ace682fe5ea/extract-content/0.log" Mar 20 11:49:52 crc kubenswrapper[4860]: I0320 11:49:52.268891 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nvhg7_4ab38144-c30d-4aed-884c-8ace682fe5ea/extract-utilities/0.log" Mar 20 11:49:52 crc kubenswrapper[4860]: I0320 11:49:52.528212 4860 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-nvhg7_4ab38144-c30d-4aed-884c-8ace682fe5ea/registry-server/0.log" Mar 20 11:49:52 crc kubenswrapper[4860]: I0320 11:49:52.545051 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-qkfjv_489f9463-a47c-4635-aad3-866e47a2c97f/marketplace-operator/0.log" Mar 20 11:49:52 crc kubenswrapper[4860]: I0320 11:49:52.583893 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mhds4_f820689f-28ee-4cbe-bf7b-049d9ec6ef64/extract-utilities/0.log" Mar 20 11:49:52 crc kubenswrapper[4860]: I0320 11:49:52.795569 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mhds4_f820689f-28ee-4cbe-bf7b-049d9ec6ef64/extract-content/0.log" Mar 20 11:49:52 crc kubenswrapper[4860]: I0320 11:49:52.800670 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mhds4_f820689f-28ee-4cbe-bf7b-049d9ec6ef64/extract-content/0.log" Mar 20 11:49:52 crc kubenswrapper[4860]: I0320 11:49:52.802687 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mhds4_f820689f-28ee-4cbe-bf7b-049d9ec6ef64/extract-utilities/0.log" Mar 20 11:49:52 crc kubenswrapper[4860]: I0320 11:49:52.976629 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mhds4_f820689f-28ee-4cbe-bf7b-049d9ec6ef64/extract-utilities/0.log" Mar 20 11:49:53 crc kubenswrapper[4860]: I0320 11:49:53.046203 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mhds4_f820689f-28ee-4cbe-bf7b-049d9ec6ef64/extract-content/0.log" Mar 20 11:49:53 crc kubenswrapper[4860]: I0320 11:49:53.213744 4860 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-mhds4_f820689f-28ee-4cbe-bf7b-049d9ec6ef64/registry-server/0.log" Mar 20 11:49:53 crc kubenswrapper[4860]: I0320 11:49:53.286476 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zrj5v_8d34a762-55ad-41cb-994e-d4707bfebe22/extract-utilities/0.log" Mar 20 11:49:53 crc kubenswrapper[4860]: I0320 11:49:53.485502 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zrj5v_8d34a762-55ad-41cb-994e-d4707bfebe22/extract-utilities/0.log" Mar 20 11:49:53 crc kubenswrapper[4860]: I0320 11:49:53.517136 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zrj5v_8d34a762-55ad-41cb-994e-d4707bfebe22/extract-content/0.log" Mar 20 11:49:53 crc kubenswrapper[4860]: I0320 11:49:53.532345 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zrj5v_8d34a762-55ad-41cb-994e-d4707bfebe22/extract-content/0.log" Mar 20 11:49:53 crc kubenswrapper[4860]: I0320 11:49:53.884123 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zrj5v_8d34a762-55ad-41cb-994e-d4707bfebe22/extract-utilities/0.log" Mar 20 11:49:53 crc kubenswrapper[4860]: I0320 11:49:53.904733 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zrj5v_8d34a762-55ad-41cb-994e-d4707bfebe22/extract-content/0.log" Mar 20 11:49:54 crc kubenswrapper[4860]: I0320 11:49:54.300943 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zrj5v_8d34a762-55ad-41cb-994e-d4707bfebe22/registry-server/0.log" Mar 20 11:50:00 crc kubenswrapper[4860]: I0320 11:50:00.194697 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566790-b9m2r"] Mar 20 11:50:00 crc kubenswrapper[4860]: E0320 11:50:00.195952 4860 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f342006-66c4-4bc6-9577-1aa4db4b4210" containerName="oc" Mar 20 11:50:00 crc kubenswrapper[4860]: I0320 11:50:00.195967 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f342006-66c4-4bc6-9577-1aa4db4b4210" containerName="oc" Mar 20 11:50:00 crc kubenswrapper[4860]: I0320 11:50:00.196161 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f342006-66c4-4bc6-9577-1aa4db4b4210" containerName="oc" Mar 20 11:50:00 crc kubenswrapper[4860]: I0320 11:50:00.196859 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566790-b9m2r" Mar 20 11:50:00 crc kubenswrapper[4860]: I0320 11:50:00.201434 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:50:00 crc kubenswrapper[4860]: I0320 11:50:00.201917 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-82p2f" Mar 20 11:50:00 crc kubenswrapper[4860]: I0320 11:50:00.203255 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:50:00 crc kubenswrapper[4860]: I0320 11:50:00.207434 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566790-b9m2r"] Mar 20 11:50:00 crc kubenswrapper[4860]: I0320 11:50:00.314401 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6rqv\" (UniqueName: \"kubernetes.io/projected/c1d1e9de-fef8-4113-b404-ee02a79e962c-kube-api-access-c6rqv\") pod \"auto-csr-approver-29566790-b9m2r\" (UID: \"c1d1e9de-fef8-4113-b404-ee02a79e962c\") " pod="openshift-infra/auto-csr-approver-29566790-b9m2r" Mar 20 11:50:00 crc kubenswrapper[4860]: I0320 11:50:00.415526 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6rqv\" 
(UniqueName: \"kubernetes.io/projected/c1d1e9de-fef8-4113-b404-ee02a79e962c-kube-api-access-c6rqv\") pod \"auto-csr-approver-29566790-b9m2r\" (UID: \"c1d1e9de-fef8-4113-b404-ee02a79e962c\") " pod="openshift-infra/auto-csr-approver-29566790-b9m2r" Mar 20 11:50:00 crc kubenswrapper[4860]: I0320 11:50:00.438617 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6rqv\" (UniqueName: \"kubernetes.io/projected/c1d1e9de-fef8-4113-b404-ee02a79e962c-kube-api-access-c6rqv\") pod \"auto-csr-approver-29566790-b9m2r\" (UID: \"c1d1e9de-fef8-4113-b404-ee02a79e962c\") " pod="openshift-infra/auto-csr-approver-29566790-b9m2r" Mar 20 11:50:00 crc kubenswrapper[4860]: I0320 11:50:00.545265 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566790-b9m2r" Mar 20 11:50:01 crc kubenswrapper[4860]: I0320 11:50:01.017519 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566790-b9m2r"] Mar 20 11:50:01 crc kubenswrapper[4860]: W0320 11:50:01.026473 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1d1e9de_fef8_4113_b404_ee02a79e962c.slice/crio-db7ef98fd469aaf652f9b02b2dd4e904907cd1ccf59d1e39fb8cd9340da628c6 WatchSource:0}: Error finding container db7ef98fd469aaf652f9b02b2dd4e904907cd1ccf59d1e39fb8cd9340da628c6: Status 404 returned error can't find the container with id db7ef98fd469aaf652f9b02b2dd4e904907cd1ccf59d1e39fb8cd9340da628c6 Mar 20 11:50:01 crc kubenswrapper[4860]: I0320 11:50:01.588463 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566790-b9m2r" event={"ID":"c1d1e9de-fef8-4113-b404-ee02a79e962c","Type":"ContainerStarted","Data":"db7ef98fd469aaf652f9b02b2dd4e904907cd1ccf59d1e39fb8cd9340da628c6"} Mar 20 11:50:03 crc kubenswrapper[4860]: I0320 11:50:03.612970 4860 generic.go:334] "Generic (PLEG): 
container finished" podID="c1d1e9de-fef8-4113-b404-ee02a79e962c" containerID="ff3cd657f654c31af8a42160cee2d2823ab955307f4eefc8a66693a18aebac08" exitCode=0 Mar 20 11:50:03 crc kubenswrapper[4860]: I0320 11:50:03.613063 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566790-b9m2r" event={"ID":"c1d1e9de-fef8-4113-b404-ee02a79e962c","Type":"ContainerDied","Data":"ff3cd657f654c31af8a42160cee2d2823ab955307f4eefc8a66693a18aebac08"} Mar 20 11:50:04 crc kubenswrapper[4860]: I0320 11:50:04.920348 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566790-b9m2r" Mar 20 11:50:05 crc kubenswrapper[4860]: I0320 11:50:05.003141 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6rqv\" (UniqueName: \"kubernetes.io/projected/c1d1e9de-fef8-4113-b404-ee02a79e962c-kube-api-access-c6rqv\") pod \"c1d1e9de-fef8-4113-b404-ee02a79e962c\" (UID: \"c1d1e9de-fef8-4113-b404-ee02a79e962c\") " Mar 20 11:50:05 crc kubenswrapper[4860]: I0320 11:50:05.015517 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1d1e9de-fef8-4113-b404-ee02a79e962c-kube-api-access-c6rqv" (OuterVolumeSpecName: "kube-api-access-c6rqv") pod "c1d1e9de-fef8-4113-b404-ee02a79e962c" (UID: "c1d1e9de-fef8-4113-b404-ee02a79e962c"). InnerVolumeSpecName "kube-api-access-c6rqv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:50:05 crc kubenswrapper[4860]: I0320 11:50:05.105648 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6rqv\" (UniqueName: \"kubernetes.io/projected/c1d1e9de-fef8-4113-b404-ee02a79e962c-kube-api-access-c6rqv\") on node \"crc\" DevicePath \"\"" Mar 20 11:50:05 crc kubenswrapper[4860]: I0320 11:50:05.630793 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566790-b9m2r" event={"ID":"c1d1e9de-fef8-4113-b404-ee02a79e962c","Type":"ContainerDied","Data":"db7ef98fd469aaf652f9b02b2dd4e904907cd1ccf59d1e39fb8cd9340da628c6"} Mar 20 11:50:05 crc kubenswrapper[4860]: I0320 11:50:05.630880 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db7ef98fd469aaf652f9b02b2dd4e904907cd1ccf59d1e39fb8cd9340da628c6" Mar 20 11:50:05 crc kubenswrapper[4860]: I0320 11:50:05.630893 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566790-b9m2r" Mar 20 11:50:06 crc kubenswrapper[4860]: I0320 11:50:06.038826 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566784-k8pc4"] Mar 20 11:50:06 crc kubenswrapper[4860]: I0320 11:50:06.057505 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566784-k8pc4"] Mar 20 11:50:07 crc kubenswrapper[4860]: I0320 11:50:07.424462 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d3485ef-9918-4aa6-80d1-c1c295d46ebe" path="/var/lib/kubelet/pods/1d3485ef-9918-4aa6-80d1-c1c295d46ebe/volumes" Mar 20 11:50:49 crc kubenswrapper[4860]: I0320 11:50:49.690303 4860 scope.go:117] "RemoveContainer" containerID="7c62f1c8ef0515a28ab838f145210c3776f9c242b812e79c909a339bcd0bc452" Mar 20 11:51:17 crc kubenswrapper[4860]: I0320 11:51:17.242260 4860 generic.go:334] "Generic (PLEG): container finished" 
podID="537f47a7-01d4-449a-8afc-a83a212f4bc5" containerID="fea1bff12d057f80685637551a80eabc97dbecd983b9b59ee632515897ce584b" exitCode=0 Mar 20 11:51:17 crc kubenswrapper[4860]: I0320 11:51:17.242341 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-972zc/must-gather-g5zkg" event={"ID":"537f47a7-01d4-449a-8afc-a83a212f4bc5","Type":"ContainerDied","Data":"fea1bff12d057f80685637551a80eabc97dbecd983b9b59ee632515897ce584b"} Mar 20 11:51:17 crc kubenswrapper[4860]: I0320 11:51:17.243914 4860 scope.go:117] "RemoveContainer" containerID="fea1bff12d057f80685637551a80eabc97dbecd983b9b59ee632515897ce584b" Mar 20 11:51:18 crc kubenswrapper[4860]: I0320 11:51:18.045363 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-972zc_must-gather-g5zkg_537f47a7-01d4-449a-8afc-a83a212f4bc5/gather/0.log" Mar 20 11:51:22 crc kubenswrapper[4860]: I0320 11:51:22.345019 4860 patch_prober.go:28] interesting pod/machine-config-daemon-kvdqp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:51:22 crc kubenswrapper[4860]: I0320 11:51:22.345544 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:51:26 crc kubenswrapper[4860]: I0320 11:51:26.230632 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-972zc/must-gather-g5zkg"] Mar 20 11:51:26 crc kubenswrapper[4860]: I0320 11:51:26.231740 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-972zc/must-gather-g5zkg" 
podUID="537f47a7-01d4-449a-8afc-a83a212f4bc5" containerName="copy" containerID="cri-o://e02bf15081b9372bf7659815dbcb88502421613acc550737bba159008d91b9ba" gracePeriod=2 Mar 20 11:51:26 crc kubenswrapper[4860]: I0320 11:51:26.237340 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-972zc/must-gather-g5zkg"] Mar 20 11:51:26 crc kubenswrapper[4860]: I0320 11:51:26.728010 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-972zc_must-gather-g5zkg_537f47a7-01d4-449a-8afc-a83a212f4bc5/copy/0.log" Mar 20 11:51:26 crc kubenswrapper[4860]: I0320 11:51:26.728995 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-972zc/must-gather-g5zkg" Mar 20 11:51:26 crc kubenswrapper[4860]: I0320 11:51:26.823002 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/537f47a7-01d4-449a-8afc-a83a212f4bc5-must-gather-output\") pod \"537f47a7-01d4-449a-8afc-a83a212f4bc5\" (UID: \"537f47a7-01d4-449a-8afc-a83a212f4bc5\") " Mar 20 11:51:26 crc kubenswrapper[4860]: I0320 11:51:26.823065 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbg2s\" (UniqueName: \"kubernetes.io/projected/537f47a7-01d4-449a-8afc-a83a212f4bc5-kube-api-access-dbg2s\") pod \"537f47a7-01d4-449a-8afc-a83a212f4bc5\" (UID: \"537f47a7-01d4-449a-8afc-a83a212f4bc5\") " Mar 20 11:51:26 crc kubenswrapper[4860]: I0320 11:51:26.831697 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/537f47a7-01d4-449a-8afc-a83a212f4bc5-kube-api-access-dbg2s" (OuterVolumeSpecName: "kube-api-access-dbg2s") pod "537f47a7-01d4-449a-8afc-a83a212f4bc5" (UID: "537f47a7-01d4-449a-8afc-a83a212f4bc5"). InnerVolumeSpecName "kube-api-access-dbg2s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:51:26 crc kubenswrapper[4860]: I0320 11:51:26.925184 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbg2s\" (UniqueName: \"kubernetes.io/projected/537f47a7-01d4-449a-8afc-a83a212f4bc5-kube-api-access-dbg2s\") on node \"crc\" DevicePath \"\"" Mar 20 11:51:26 crc kubenswrapper[4860]: I0320 11:51:26.925197 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/537f47a7-01d4-449a-8afc-a83a212f4bc5-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "537f47a7-01d4-449a-8afc-a83a212f4bc5" (UID: "537f47a7-01d4-449a-8afc-a83a212f4bc5"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:51:27 crc kubenswrapper[4860]: I0320 11:51:27.026597 4860 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/537f47a7-01d4-449a-8afc-a83a212f4bc5-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 20 11:51:27 crc kubenswrapper[4860]: I0320 11:51:27.328932 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-972zc_must-gather-g5zkg_537f47a7-01d4-449a-8afc-a83a212f4bc5/copy/0.log" Mar 20 11:51:27 crc kubenswrapper[4860]: I0320 11:51:27.331493 4860 generic.go:334] "Generic (PLEG): container finished" podID="537f47a7-01d4-449a-8afc-a83a212f4bc5" containerID="e02bf15081b9372bf7659815dbcb88502421613acc550737bba159008d91b9ba" exitCode=143 Mar 20 11:51:27 crc kubenswrapper[4860]: I0320 11:51:27.331639 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-972zc/must-gather-g5zkg" Mar 20 11:51:27 crc kubenswrapper[4860]: I0320 11:51:27.331577 4860 scope.go:117] "RemoveContainer" containerID="e02bf15081b9372bf7659815dbcb88502421613acc550737bba159008d91b9ba" Mar 20 11:51:27 crc kubenswrapper[4860]: I0320 11:51:27.356318 4860 scope.go:117] "RemoveContainer" containerID="fea1bff12d057f80685637551a80eabc97dbecd983b9b59ee632515897ce584b" Mar 20 11:51:27 crc kubenswrapper[4860]: I0320 11:51:27.420079 4860 scope.go:117] "RemoveContainer" containerID="e02bf15081b9372bf7659815dbcb88502421613acc550737bba159008d91b9ba" Mar 20 11:51:27 crc kubenswrapper[4860]: E0320 11:51:27.420451 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e02bf15081b9372bf7659815dbcb88502421613acc550737bba159008d91b9ba\": container with ID starting with e02bf15081b9372bf7659815dbcb88502421613acc550737bba159008d91b9ba not found: ID does not exist" containerID="e02bf15081b9372bf7659815dbcb88502421613acc550737bba159008d91b9ba" Mar 20 11:51:27 crc kubenswrapper[4860]: I0320 11:51:27.420479 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e02bf15081b9372bf7659815dbcb88502421613acc550737bba159008d91b9ba"} err="failed to get container status \"e02bf15081b9372bf7659815dbcb88502421613acc550737bba159008d91b9ba\": rpc error: code = NotFound desc = could not find container \"e02bf15081b9372bf7659815dbcb88502421613acc550737bba159008d91b9ba\": container with ID starting with e02bf15081b9372bf7659815dbcb88502421613acc550737bba159008d91b9ba not found: ID does not exist" Mar 20 11:51:27 crc kubenswrapper[4860]: I0320 11:51:27.420500 4860 scope.go:117] "RemoveContainer" containerID="fea1bff12d057f80685637551a80eabc97dbecd983b9b59ee632515897ce584b" Mar 20 11:51:27 crc kubenswrapper[4860]: I0320 11:51:27.423294 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="537f47a7-01d4-449a-8afc-a83a212f4bc5" path="/var/lib/kubelet/pods/537f47a7-01d4-449a-8afc-a83a212f4bc5/volumes" Mar 20 11:51:27 crc kubenswrapper[4860]: E0320 11:51:27.423341 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fea1bff12d057f80685637551a80eabc97dbecd983b9b59ee632515897ce584b\": container with ID starting with fea1bff12d057f80685637551a80eabc97dbecd983b9b59ee632515897ce584b not found: ID does not exist" containerID="fea1bff12d057f80685637551a80eabc97dbecd983b9b59ee632515897ce584b" Mar 20 11:51:27 crc kubenswrapper[4860]: I0320 11:51:27.423375 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fea1bff12d057f80685637551a80eabc97dbecd983b9b59ee632515897ce584b"} err="failed to get container status \"fea1bff12d057f80685637551a80eabc97dbecd983b9b59ee632515897ce584b\": rpc error: code = NotFound desc = could not find container \"fea1bff12d057f80685637551a80eabc97dbecd983b9b59ee632515897ce584b\": container with ID starting with fea1bff12d057f80685637551a80eabc97dbecd983b9b59ee632515897ce584b not found: ID does not exist" Mar 20 11:51:52 crc kubenswrapper[4860]: I0320 11:51:52.346375 4860 patch_prober.go:28] interesting pod/machine-config-daemon-kvdqp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:51:52 crc kubenswrapper[4860]: I0320 11:51:52.347340 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:51:57 crc kubenswrapper[4860]: I0320 11:51:57.053928 4860 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wrg5v"] Mar 20 11:51:57 crc kubenswrapper[4860]: E0320 11:51:57.055408 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1d1e9de-fef8-4113-b404-ee02a79e962c" containerName="oc" Mar 20 11:51:57 crc kubenswrapper[4860]: I0320 11:51:57.055429 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1d1e9de-fef8-4113-b404-ee02a79e962c" containerName="oc" Mar 20 11:51:57 crc kubenswrapper[4860]: E0320 11:51:57.055465 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="537f47a7-01d4-449a-8afc-a83a212f4bc5" containerName="gather" Mar 20 11:51:57 crc kubenswrapper[4860]: I0320 11:51:57.055474 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="537f47a7-01d4-449a-8afc-a83a212f4bc5" containerName="gather" Mar 20 11:51:57 crc kubenswrapper[4860]: E0320 11:51:57.055495 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="537f47a7-01d4-449a-8afc-a83a212f4bc5" containerName="copy" Mar 20 11:51:57 crc kubenswrapper[4860]: I0320 11:51:57.055502 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="537f47a7-01d4-449a-8afc-a83a212f4bc5" containerName="copy" Mar 20 11:51:57 crc kubenswrapper[4860]: I0320 11:51:57.055729 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="537f47a7-01d4-449a-8afc-a83a212f4bc5" containerName="copy" Mar 20 11:51:57 crc kubenswrapper[4860]: I0320 11:51:57.055746 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="537f47a7-01d4-449a-8afc-a83a212f4bc5" containerName="gather" Mar 20 11:51:57 crc kubenswrapper[4860]: I0320 11:51:57.055757 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1d1e9de-fef8-4113-b404-ee02a79e962c" containerName="oc" Mar 20 11:51:57 crc kubenswrapper[4860]: I0320 11:51:57.057136 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wrg5v" Mar 20 11:51:57 crc kubenswrapper[4860]: I0320 11:51:57.067689 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wrg5v"] Mar 20 11:51:57 crc kubenswrapper[4860]: I0320 11:51:57.169238 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60759547-601f-4452-b887-94820dba6b6c-utilities\") pod \"community-operators-wrg5v\" (UID: \"60759547-601f-4452-b887-94820dba6b6c\") " pod="openshift-marketplace/community-operators-wrg5v" Mar 20 11:51:57 crc kubenswrapper[4860]: I0320 11:51:57.169710 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60759547-601f-4452-b887-94820dba6b6c-catalog-content\") pod \"community-operators-wrg5v\" (UID: \"60759547-601f-4452-b887-94820dba6b6c\") " pod="openshift-marketplace/community-operators-wrg5v" Mar 20 11:51:57 crc kubenswrapper[4860]: I0320 11:51:57.169856 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmb74\" (UniqueName: \"kubernetes.io/projected/60759547-601f-4452-b887-94820dba6b6c-kube-api-access-vmb74\") pod \"community-operators-wrg5v\" (UID: \"60759547-601f-4452-b887-94820dba6b6c\") " pod="openshift-marketplace/community-operators-wrg5v" Mar 20 11:51:57 crc kubenswrapper[4860]: I0320 11:51:57.271551 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60759547-601f-4452-b887-94820dba6b6c-catalog-content\") pod \"community-operators-wrg5v\" (UID: \"60759547-601f-4452-b887-94820dba6b6c\") " pod="openshift-marketplace/community-operators-wrg5v" Mar 20 11:51:57 crc kubenswrapper[4860]: I0320 11:51:57.272008 4860 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-vmb74\" (UniqueName: \"kubernetes.io/projected/60759547-601f-4452-b887-94820dba6b6c-kube-api-access-vmb74\") pod \"community-operators-wrg5v\" (UID: \"60759547-601f-4452-b887-94820dba6b6c\") " pod="openshift-marketplace/community-operators-wrg5v" Mar 20 11:51:57 crc kubenswrapper[4860]: I0320 11:51:57.272290 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60759547-601f-4452-b887-94820dba6b6c-utilities\") pod \"community-operators-wrg5v\" (UID: \"60759547-601f-4452-b887-94820dba6b6c\") " pod="openshift-marketplace/community-operators-wrg5v" Mar 20 11:51:57 crc kubenswrapper[4860]: I0320 11:51:57.272450 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60759547-601f-4452-b887-94820dba6b6c-catalog-content\") pod \"community-operators-wrg5v\" (UID: \"60759547-601f-4452-b887-94820dba6b6c\") " pod="openshift-marketplace/community-operators-wrg5v" Mar 20 11:51:57 crc kubenswrapper[4860]: I0320 11:51:57.272900 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60759547-601f-4452-b887-94820dba6b6c-utilities\") pod \"community-operators-wrg5v\" (UID: \"60759547-601f-4452-b887-94820dba6b6c\") " pod="openshift-marketplace/community-operators-wrg5v" Mar 20 11:51:57 crc kubenswrapper[4860]: I0320 11:51:57.296126 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmb74\" (UniqueName: \"kubernetes.io/projected/60759547-601f-4452-b887-94820dba6b6c-kube-api-access-vmb74\") pod \"community-operators-wrg5v\" (UID: \"60759547-601f-4452-b887-94820dba6b6c\") " pod="openshift-marketplace/community-operators-wrg5v" Mar 20 11:51:57 crc kubenswrapper[4860]: I0320 11:51:57.398471 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wrg5v" Mar 20 11:51:57 crc kubenswrapper[4860]: I0320 11:51:57.918766 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wrg5v"] Mar 20 11:51:58 crc kubenswrapper[4860]: I0320 11:51:58.596018 4860 generic.go:334] "Generic (PLEG): container finished" podID="60759547-601f-4452-b887-94820dba6b6c" containerID="eec0d12b888c2750e2f84d2681310230377b0ba8dfbc189490e049f7b02b5b34" exitCode=0 Mar 20 11:51:58 crc kubenswrapper[4860]: I0320 11:51:58.596090 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wrg5v" event={"ID":"60759547-601f-4452-b887-94820dba6b6c","Type":"ContainerDied","Data":"eec0d12b888c2750e2f84d2681310230377b0ba8dfbc189490e049f7b02b5b34"} Mar 20 11:51:58 crc kubenswrapper[4860]: I0320 11:51:58.596647 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wrg5v" event={"ID":"60759547-601f-4452-b887-94820dba6b6c","Type":"ContainerStarted","Data":"76d0ff7a0d4f9572fcd6864eccc27b2ca9db2cb75e81302c9cd1d8a7f3b9b3fd"} Mar 20 11:51:58 crc kubenswrapper[4860]: I0320 11:51:58.599300 4860 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 11:51:59 crc kubenswrapper[4860]: I0320 11:51:59.608866 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wrg5v" event={"ID":"60759547-601f-4452-b887-94820dba6b6c","Type":"ContainerStarted","Data":"c93f9f1495b98e971e5c8db073e3b1aeb25deb749f36896b3b2f29df04e524df"} Mar 20 11:52:00 crc kubenswrapper[4860]: I0320 11:52:00.151737 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566792-ggljw"] Mar 20 11:52:00 crc kubenswrapper[4860]: I0320 11:52:00.152958 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566792-ggljw" Mar 20 11:52:00 crc kubenswrapper[4860]: I0320 11:52:00.157706 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:52:00 crc kubenswrapper[4860]: I0320 11:52:00.157809 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:52:00 crc kubenswrapper[4860]: I0320 11:52:00.157978 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-82p2f" Mar 20 11:52:00 crc kubenswrapper[4860]: I0320 11:52:00.162474 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566792-ggljw"] Mar 20 11:52:00 crc kubenswrapper[4860]: I0320 11:52:00.324131 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f72mh\" (UniqueName: \"kubernetes.io/projected/2a33a045-cd5c-4295-8afd-92b36e24a572-kube-api-access-f72mh\") pod \"auto-csr-approver-29566792-ggljw\" (UID: \"2a33a045-cd5c-4295-8afd-92b36e24a572\") " pod="openshift-infra/auto-csr-approver-29566792-ggljw" Mar 20 11:52:00 crc kubenswrapper[4860]: I0320 11:52:00.425424 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f72mh\" (UniqueName: \"kubernetes.io/projected/2a33a045-cd5c-4295-8afd-92b36e24a572-kube-api-access-f72mh\") pod \"auto-csr-approver-29566792-ggljw\" (UID: \"2a33a045-cd5c-4295-8afd-92b36e24a572\") " pod="openshift-infra/auto-csr-approver-29566792-ggljw" Mar 20 11:52:00 crc kubenswrapper[4860]: I0320 11:52:00.450387 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f72mh\" (UniqueName: \"kubernetes.io/projected/2a33a045-cd5c-4295-8afd-92b36e24a572-kube-api-access-f72mh\") pod \"auto-csr-approver-29566792-ggljw\" (UID: \"2a33a045-cd5c-4295-8afd-92b36e24a572\") " 
pod="openshift-infra/auto-csr-approver-29566792-ggljw" Mar 20 11:52:00 crc kubenswrapper[4860]: I0320 11:52:00.520696 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566792-ggljw" Mar 20 11:52:00 crc kubenswrapper[4860]: I0320 11:52:00.648643 4860 generic.go:334] "Generic (PLEG): container finished" podID="60759547-601f-4452-b887-94820dba6b6c" containerID="c93f9f1495b98e971e5c8db073e3b1aeb25deb749f36896b3b2f29df04e524df" exitCode=0 Mar 20 11:52:00 crc kubenswrapper[4860]: I0320 11:52:00.648893 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wrg5v" event={"ID":"60759547-601f-4452-b887-94820dba6b6c","Type":"ContainerDied","Data":"c93f9f1495b98e971e5c8db073e3b1aeb25deb749f36896b3b2f29df04e524df"} Mar 20 11:52:00 crc kubenswrapper[4860]: I0320 11:52:00.976920 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566792-ggljw"] Mar 20 11:52:01 crc kubenswrapper[4860]: I0320 11:52:01.663671 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wrg5v" event={"ID":"60759547-601f-4452-b887-94820dba6b6c","Type":"ContainerStarted","Data":"22f0903b56f3bfa8dd880f9797a3c8fb10a4601857b23afae60154832f57e6bf"} Mar 20 11:52:01 crc kubenswrapper[4860]: I0320 11:52:01.665589 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566792-ggljw" event={"ID":"2a33a045-cd5c-4295-8afd-92b36e24a572","Type":"ContainerStarted","Data":"ef15e4d74cef2060c0738f0bd14f8359d3e53b0bfbfceae0b43ca70ef203bdf7"} Mar 20 11:52:01 crc kubenswrapper[4860]: I0320 11:52:01.705529 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wrg5v" podStartSLOduration=2.197807248 podStartE2EDuration="4.705491877s" podCreationTimestamp="2026-03-20 11:51:57 +0000 UTC" firstStartedPulling="2026-03-20 
11:51:58.598896533 +0000 UTC m=+3442.820257431" lastFinishedPulling="2026-03-20 11:52:01.106581162 +0000 UTC m=+3445.327942060" observedRunningTime="2026-03-20 11:52:01.695069526 +0000 UTC m=+3445.916430414" watchObservedRunningTime="2026-03-20 11:52:01.705491877 +0000 UTC m=+3445.926852775" Mar 20 11:52:02 crc kubenswrapper[4860]: I0320 11:52:02.675010 4860 generic.go:334] "Generic (PLEG): container finished" podID="2a33a045-cd5c-4295-8afd-92b36e24a572" containerID="8536cbf5e2641eae0dadbfbbe6f7cd6043ca90153b3252b63e0cfc33da21beb0" exitCode=0 Mar 20 11:52:02 crc kubenswrapper[4860]: I0320 11:52:02.675076 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566792-ggljw" event={"ID":"2a33a045-cd5c-4295-8afd-92b36e24a572","Type":"ContainerDied","Data":"8536cbf5e2641eae0dadbfbbe6f7cd6043ca90153b3252b63e0cfc33da21beb0"} Mar 20 11:52:04 crc kubenswrapper[4860]: I0320 11:52:04.036006 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566792-ggljw" Mar 20 11:52:04 crc kubenswrapper[4860]: I0320 11:52:04.185996 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f72mh\" (UniqueName: \"kubernetes.io/projected/2a33a045-cd5c-4295-8afd-92b36e24a572-kube-api-access-f72mh\") pod \"2a33a045-cd5c-4295-8afd-92b36e24a572\" (UID: \"2a33a045-cd5c-4295-8afd-92b36e24a572\") " Mar 20 11:52:04 crc kubenswrapper[4860]: I0320 11:52:04.200476 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a33a045-cd5c-4295-8afd-92b36e24a572-kube-api-access-f72mh" (OuterVolumeSpecName: "kube-api-access-f72mh") pod "2a33a045-cd5c-4295-8afd-92b36e24a572" (UID: "2a33a045-cd5c-4295-8afd-92b36e24a572"). InnerVolumeSpecName "kube-api-access-f72mh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:52:04 crc kubenswrapper[4860]: I0320 11:52:04.287926 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f72mh\" (UniqueName: \"kubernetes.io/projected/2a33a045-cd5c-4295-8afd-92b36e24a572-kube-api-access-f72mh\") on node \"crc\" DevicePath \"\"" Mar 20 11:52:04 crc kubenswrapper[4860]: I0320 11:52:04.692372 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566792-ggljw" event={"ID":"2a33a045-cd5c-4295-8afd-92b36e24a572","Type":"ContainerDied","Data":"ef15e4d74cef2060c0738f0bd14f8359d3e53b0bfbfceae0b43ca70ef203bdf7"} Mar 20 11:52:04 crc kubenswrapper[4860]: I0320 11:52:04.692769 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef15e4d74cef2060c0738f0bd14f8359d3e53b0bfbfceae0b43ca70ef203bdf7" Mar 20 11:52:04 crc kubenswrapper[4860]: I0320 11:52:04.692451 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566792-ggljw" Mar 20 11:52:05 crc kubenswrapper[4860]: I0320 11:52:05.127510 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566786-6kgpn"] Mar 20 11:52:05 crc kubenswrapper[4860]: I0320 11:52:05.134312 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566786-6kgpn"] Mar 20 11:52:05 crc kubenswrapper[4860]: I0320 11:52:05.425086 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a69dccb3-324d-47b3-92d0-af9fc224932d" path="/var/lib/kubelet/pods/a69dccb3-324d-47b3-92d0-af9fc224932d/volumes" Mar 20 11:52:07 crc kubenswrapper[4860]: I0320 11:52:07.398981 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wrg5v" Mar 20 11:52:07 crc kubenswrapper[4860]: I0320 11:52:07.399483 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/community-operators-wrg5v" Mar 20 11:52:07 crc kubenswrapper[4860]: I0320 11:52:07.442811 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wrg5v" Mar 20 11:52:07 crc kubenswrapper[4860]: I0320 11:52:07.764533 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wrg5v" Mar 20 11:52:07 crc kubenswrapper[4860]: I0320 11:52:07.824931 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wrg5v"] Mar 20 11:52:09 crc kubenswrapper[4860]: I0320 11:52:09.728141 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wrg5v" podUID="60759547-601f-4452-b887-94820dba6b6c" containerName="registry-server" containerID="cri-o://22f0903b56f3bfa8dd880f9797a3c8fb10a4601857b23afae60154832f57e6bf" gracePeriod=2 Mar 20 11:52:10 crc kubenswrapper[4860]: I0320 11:52:10.204737 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wrg5v" Mar 20 11:52:10 crc kubenswrapper[4860]: I0320 11:52:10.388408 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmb74\" (UniqueName: \"kubernetes.io/projected/60759547-601f-4452-b887-94820dba6b6c-kube-api-access-vmb74\") pod \"60759547-601f-4452-b887-94820dba6b6c\" (UID: \"60759547-601f-4452-b887-94820dba6b6c\") " Mar 20 11:52:10 crc kubenswrapper[4860]: I0320 11:52:10.388569 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60759547-601f-4452-b887-94820dba6b6c-catalog-content\") pod \"60759547-601f-4452-b887-94820dba6b6c\" (UID: \"60759547-601f-4452-b887-94820dba6b6c\") " Mar 20 11:52:10 crc kubenswrapper[4860]: I0320 11:52:10.388611 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60759547-601f-4452-b887-94820dba6b6c-utilities\") pod \"60759547-601f-4452-b887-94820dba6b6c\" (UID: \"60759547-601f-4452-b887-94820dba6b6c\") " Mar 20 11:52:10 crc kubenswrapper[4860]: I0320 11:52:10.389911 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60759547-601f-4452-b887-94820dba6b6c-utilities" (OuterVolumeSpecName: "utilities") pod "60759547-601f-4452-b887-94820dba6b6c" (UID: "60759547-601f-4452-b887-94820dba6b6c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:52:10 crc kubenswrapper[4860]: I0320 11:52:10.398448 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60759547-601f-4452-b887-94820dba6b6c-kube-api-access-vmb74" (OuterVolumeSpecName: "kube-api-access-vmb74") pod "60759547-601f-4452-b887-94820dba6b6c" (UID: "60759547-601f-4452-b887-94820dba6b6c"). InnerVolumeSpecName "kube-api-access-vmb74". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:52:10 crc kubenswrapper[4860]: I0320 11:52:10.446179 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60759547-601f-4452-b887-94820dba6b6c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "60759547-601f-4452-b887-94820dba6b6c" (UID: "60759547-601f-4452-b887-94820dba6b6c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:52:10 crc kubenswrapper[4860]: I0320 11:52:10.490746 4860 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60759547-601f-4452-b887-94820dba6b6c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:52:10 crc kubenswrapper[4860]: I0320 11:52:10.490830 4860 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60759547-601f-4452-b887-94820dba6b6c-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:52:10 crc kubenswrapper[4860]: I0320 11:52:10.490844 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmb74\" (UniqueName: \"kubernetes.io/projected/60759547-601f-4452-b887-94820dba6b6c-kube-api-access-vmb74\") on node \"crc\" DevicePath \"\"" Mar 20 11:52:10 crc kubenswrapper[4860]: I0320 11:52:10.747048 4860 generic.go:334] "Generic (PLEG): container finished" podID="60759547-601f-4452-b887-94820dba6b6c" containerID="22f0903b56f3bfa8dd880f9797a3c8fb10a4601857b23afae60154832f57e6bf" exitCode=0 Mar 20 11:52:10 crc kubenswrapper[4860]: I0320 11:52:10.747102 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wrg5v" event={"ID":"60759547-601f-4452-b887-94820dba6b6c","Type":"ContainerDied","Data":"22f0903b56f3bfa8dd880f9797a3c8fb10a4601857b23afae60154832f57e6bf"} Mar 20 11:52:10 crc kubenswrapper[4860]: I0320 11:52:10.747138 4860 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-wrg5v" event={"ID":"60759547-601f-4452-b887-94820dba6b6c","Type":"ContainerDied","Data":"76d0ff7a0d4f9572fcd6864eccc27b2ca9db2cb75e81302c9cd1d8a7f3b9b3fd"} Mar 20 11:52:10 crc kubenswrapper[4860]: I0320 11:52:10.747157 4860 scope.go:117] "RemoveContainer" containerID="22f0903b56f3bfa8dd880f9797a3c8fb10a4601857b23afae60154832f57e6bf" Mar 20 11:52:10 crc kubenswrapper[4860]: I0320 11:52:10.747338 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wrg5v" Mar 20 11:52:10 crc kubenswrapper[4860]: I0320 11:52:10.771596 4860 scope.go:117] "RemoveContainer" containerID="c93f9f1495b98e971e5c8db073e3b1aeb25deb749f36896b3b2f29df04e524df" Mar 20 11:52:10 crc kubenswrapper[4860]: I0320 11:52:10.786651 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wrg5v"] Mar 20 11:52:10 crc kubenswrapper[4860]: I0320 11:52:10.793719 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wrg5v"] Mar 20 11:52:10 crc kubenswrapper[4860]: I0320 11:52:10.812460 4860 scope.go:117] "RemoveContainer" containerID="eec0d12b888c2750e2f84d2681310230377b0ba8dfbc189490e049f7b02b5b34" Mar 20 11:52:10 crc kubenswrapper[4860]: I0320 11:52:10.842934 4860 scope.go:117] "RemoveContainer" containerID="22f0903b56f3bfa8dd880f9797a3c8fb10a4601857b23afae60154832f57e6bf" Mar 20 11:52:10 crc kubenswrapper[4860]: E0320 11:52:10.843718 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22f0903b56f3bfa8dd880f9797a3c8fb10a4601857b23afae60154832f57e6bf\": container with ID starting with 22f0903b56f3bfa8dd880f9797a3c8fb10a4601857b23afae60154832f57e6bf not found: ID does not exist" containerID="22f0903b56f3bfa8dd880f9797a3c8fb10a4601857b23afae60154832f57e6bf" Mar 20 11:52:10 crc kubenswrapper[4860]: I0320 
11:52:10.843809 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22f0903b56f3bfa8dd880f9797a3c8fb10a4601857b23afae60154832f57e6bf"} err="failed to get container status \"22f0903b56f3bfa8dd880f9797a3c8fb10a4601857b23afae60154832f57e6bf\": rpc error: code = NotFound desc = could not find container \"22f0903b56f3bfa8dd880f9797a3c8fb10a4601857b23afae60154832f57e6bf\": container with ID starting with 22f0903b56f3bfa8dd880f9797a3c8fb10a4601857b23afae60154832f57e6bf not found: ID does not exist" Mar 20 11:52:10 crc kubenswrapper[4860]: I0320 11:52:10.843863 4860 scope.go:117] "RemoveContainer" containerID="c93f9f1495b98e971e5c8db073e3b1aeb25deb749f36896b3b2f29df04e524df" Mar 20 11:52:10 crc kubenswrapper[4860]: E0320 11:52:10.844614 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c93f9f1495b98e971e5c8db073e3b1aeb25deb749f36896b3b2f29df04e524df\": container with ID starting with c93f9f1495b98e971e5c8db073e3b1aeb25deb749f36896b3b2f29df04e524df not found: ID does not exist" containerID="c93f9f1495b98e971e5c8db073e3b1aeb25deb749f36896b3b2f29df04e524df" Mar 20 11:52:10 crc kubenswrapper[4860]: I0320 11:52:10.844648 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c93f9f1495b98e971e5c8db073e3b1aeb25deb749f36896b3b2f29df04e524df"} err="failed to get container status \"c93f9f1495b98e971e5c8db073e3b1aeb25deb749f36896b3b2f29df04e524df\": rpc error: code = NotFound desc = could not find container \"c93f9f1495b98e971e5c8db073e3b1aeb25deb749f36896b3b2f29df04e524df\": container with ID starting with c93f9f1495b98e971e5c8db073e3b1aeb25deb749f36896b3b2f29df04e524df not found: ID does not exist" Mar 20 11:52:10 crc kubenswrapper[4860]: I0320 11:52:10.844665 4860 scope.go:117] "RemoveContainer" containerID="eec0d12b888c2750e2f84d2681310230377b0ba8dfbc189490e049f7b02b5b34" Mar 20 11:52:10 crc 
kubenswrapper[4860]: E0320 11:52:10.845161 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eec0d12b888c2750e2f84d2681310230377b0ba8dfbc189490e049f7b02b5b34\": container with ID starting with eec0d12b888c2750e2f84d2681310230377b0ba8dfbc189490e049f7b02b5b34 not found: ID does not exist" containerID="eec0d12b888c2750e2f84d2681310230377b0ba8dfbc189490e049f7b02b5b34" Mar 20 11:52:10 crc kubenswrapper[4860]: I0320 11:52:10.845209 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eec0d12b888c2750e2f84d2681310230377b0ba8dfbc189490e049f7b02b5b34"} err="failed to get container status \"eec0d12b888c2750e2f84d2681310230377b0ba8dfbc189490e049f7b02b5b34\": rpc error: code = NotFound desc = could not find container \"eec0d12b888c2750e2f84d2681310230377b0ba8dfbc189490e049f7b02b5b34\": container with ID starting with eec0d12b888c2750e2f84d2681310230377b0ba8dfbc189490e049f7b02b5b34 not found: ID does not exist" Mar 20 11:52:11 crc kubenswrapper[4860]: I0320 11:52:11.435367 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60759547-601f-4452-b887-94820dba6b6c" path="/var/lib/kubelet/pods/60759547-601f-4452-b887-94820dba6b6c/volumes" Mar 20 11:52:22 crc kubenswrapper[4860]: I0320 11:52:22.344363 4860 patch_prober.go:28] interesting pod/machine-config-daemon-kvdqp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:52:22 crc kubenswrapper[4860]: I0320 11:52:22.345105 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Mar 20 11:52:22 crc kubenswrapper[4860]: I0320 11:52:22.345166 4860 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" Mar 20 11:52:22 crc kubenswrapper[4860]: I0320 11:52:22.346062 4860 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"27de1546d8fc12cd142537f3e435648e650a77886fa9d1677aedbb1fcd90c4bd"} pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 11:52:22 crc kubenswrapper[4860]: I0320 11:52:22.346142 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" containerID="cri-o://27de1546d8fc12cd142537f3e435648e650a77886fa9d1677aedbb1fcd90c4bd" gracePeriod=600 Mar 20 11:52:22 crc kubenswrapper[4860]: I0320 11:52:22.866450 4860 generic.go:334] "Generic (PLEG): container finished" podID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerID="27de1546d8fc12cd142537f3e435648e650a77886fa9d1677aedbb1fcd90c4bd" exitCode=0 Mar 20 11:52:22 crc kubenswrapper[4860]: I0320 11:52:22.866533 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" event={"ID":"6a9df230-75a1-4b64-8d00-c179e9c19080","Type":"ContainerDied","Data":"27de1546d8fc12cd142537f3e435648e650a77886fa9d1677aedbb1fcd90c4bd"} Mar 20 11:52:22 crc kubenswrapper[4860]: I0320 11:52:22.866953 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" 
event={"ID":"6a9df230-75a1-4b64-8d00-c179e9c19080","Type":"ContainerStarted","Data":"c3a6e4b824ce80b190435234a888f23469947223fd7d2c0597395741b3d52f34"} Mar 20 11:52:22 crc kubenswrapper[4860]: I0320 11:52:22.866986 4860 scope.go:117] "RemoveContainer" containerID="2c6cd87f8a77136a1839b5f761bdf9d0b06d37fb07e23d5035e6ffed2dcf296d" Mar 20 11:52:49 crc kubenswrapper[4860]: I0320 11:52:49.840847 4860 scope.go:117] "RemoveContainer" containerID="02ca8def8758e2ad1b605230bcb844ea2d285141ff0c9b3e5a91bad1e50bf67e" Mar 20 11:54:00 crc kubenswrapper[4860]: I0320 11:54:00.165925 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566794-qpsdl"] Mar 20 11:54:00 crc kubenswrapper[4860]: E0320 11:54:00.166902 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60759547-601f-4452-b887-94820dba6b6c" containerName="extract-utilities" Mar 20 11:54:00 crc kubenswrapper[4860]: I0320 11:54:00.166917 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="60759547-601f-4452-b887-94820dba6b6c" containerName="extract-utilities" Mar 20 11:54:00 crc kubenswrapper[4860]: E0320 11:54:00.166926 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a33a045-cd5c-4295-8afd-92b36e24a572" containerName="oc" Mar 20 11:54:00 crc kubenswrapper[4860]: I0320 11:54:00.166933 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a33a045-cd5c-4295-8afd-92b36e24a572" containerName="oc" Mar 20 11:54:00 crc kubenswrapper[4860]: E0320 11:54:00.166953 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60759547-601f-4452-b887-94820dba6b6c" containerName="extract-content" Mar 20 11:54:00 crc kubenswrapper[4860]: I0320 11:54:00.166961 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="60759547-601f-4452-b887-94820dba6b6c" containerName="extract-content" Mar 20 11:54:00 crc kubenswrapper[4860]: E0320 11:54:00.166987 4860 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="60759547-601f-4452-b887-94820dba6b6c" containerName="registry-server" Mar 20 11:54:00 crc kubenswrapper[4860]: I0320 11:54:00.166995 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="60759547-601f-4452-b887-94820dba6b6c" containerName="registry-server" Mar 20 11:54:00 crc kubenswrapper[4860]: I0320 11:54:00.167157 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a33a045-cd5c-4295-8afd-92b36e24a572" containerName="oc" Mar 20 11:54:00 crc kubenswrapper[4860]: I0320 11:54:00.167172 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="60759547-601f-4452-b887-94820dba6b6c" containerName="registry-server" Mar 20 11:54:00 crc kubenswrapper[4860]: I0320 11:54:00.167666 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566794-qpsdl" Mar 20 11:54:00 crc kubenswrapper[4860]: I0320 11:54:00.177775 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566794-qpsdl"] Mar 20 11:54:00 crc kubenswrapper[4860]: I0320 11:54:00.179048 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:54:00 crc kubenswrapper[4860]: I0320 11:54:00.180715 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-82p2f" Mar 20 11:54:00 crc kubenswrapper[4860]: I0320 11:54:00.187056 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:54:00 crc kubenswrapper[4860]: I0320 11:54:00.277037 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btg9p\" (UniqueName: \"kubernetes.io/projected/1babb09d-4647-48e8-bb5a-ea00aa1e0a89-kube-api-access-btg9p\") pod \"auto-csr-approver-29566794-qpsdl\" (UID: \"1babb09d-4647-48e8-bb5a-ea00aa1e0a89\") " pod="openshift-infra/auto-csr-approver-29566794-qpsdl" 
Mar 20 11:54:00 crc kubenswrapper[4860]: I0320 11:54:00.378983 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btg9p\" (UniqueName: \"kubernetes.io/projected/1babb09d-4647-48e8-bb5a-ea00aa1e0a89-kube-api-access-btg9p\") pod \"auto-csr-approver-29566794-qpsdl\" (UID: \"1babb09d-4647-48e8-bb5a-ea00aa1e0a89\") " pod="openshift-infra/auto-csr-approver-29566794-qpsdl"
Mar 20 11:54:00 crc kubenswrapper[4860]: I0320 11:54:00.406972 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btg9p\" (UniqueName: \"kubernetes.io/projected/1babb09d-4647-48e8-bb5a-ea00aa1e0a89-kube-api-access-btg9p\") pod \"auto-csr-approver-29566794-qpsdl\" (UID: \"1babb09d-4647-48e8-bb5a-ea00aa1e0a89\") " pod="openshift-infra/auto-csr-approver-29566794-qpsdl"
Mar 20 11:54:00 crc kubenswrapper[4860]: I0320 11:54:00.421488 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6pxcc"]
Mar 20 11:54:00 crc kubenswrapper[4860]: I0320 11:54:00.425015 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6pxcc"
Mar 20 11:54:00 crc kubenswrapper[4860]: I0320 11:54:00.442862 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6pxcc"]
Mar 20 11:54:00 crc kubenswrapper[4860]: I0320 11:54:00.481517 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68089545-e05b-4352-b47d-37ad7ae7bd55-catalog-content\") pod \"redhat-marketplace-6pxcc\" (UID: \"68089545-e05b-4352-b47d-37ad7ae7bd55\") " pod="openshift-marketplace/redhat-marketplace-6pxcc"
Mar 20 11:54:00 crc kubenswrapper[4860]: I0320 11:54:00.481601 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhtm5\" (UniqueName: \"kubernetes.io/projected/68089545-e05b-4352-b47d-37ad7ae7bd55-kube-api-access-hhtm5\") pod \"redhat-marketplace-6pxcc\" (UID: \"68089545-e05b-4352-b47d-37ad7ae7bd55\") " pod="openshift-marketplace/redhat-marketplace-6pxcc"
Mar 20 11:54:00 crc kubenswrapper[4860]: I0320 11:54:00.481637 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68089545-e05b-4352-b47d-37ad7ae7bd55-utilities\") pod \"redhat-marketplace-6pxcc\" (UID: \"68089545-e05b-4352-b47d-37ad7ae7bd55\") " pod="openshift-marketplace/redhat-marketplace-6pxcc"
Mar 20 11:54:00 crc kubenswrapper[4860]: I0320 11:54:00.492596 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566794-qpsdl"
Mar 20 11:54:00 crc kubenswrapper[4860]: I0320 11:54:00.582946 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68089545-e05b-4352-b47d-37ad7ae7bd55-catalog-content\") pod \"redhat-marketplace-6pxcc\" (UID: \"68089545-e05b-4352-b47d-37ad7ae7bd55\") " pod="openshift-marketplace/redhat-marketplace-6pxcc"
Mar 20 11:54:00 crc kubenswrapper[4860]: I0320 11:54:00.583033 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhtm5\" (UniqueName: \"kubernetes.io/projected/68089545-e05b-4352-b47d-37ad7ae7bd55-kube-api-access-hhtm5\") pod \"redhat-marketplace-6pxcc\" (UID: \"68089545-e05b-4352-b47d-37ad7ae7bd55\") " pod="openshift-marketplace/redhat-marketplace-6pxcc"
Mar 20 11:54:00 crc kubenswrapper[4860]: I0320 11:54:00.583080 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68089545-e05b-4352-b47d-37ad7ae7bd55-utilities\") pod \"redhat-marketplace-6pxcc\" (UID: \"68089545-e05b-4352-b47d-37ad7ae7bd55\") " pod="openshift-marketplace/redhat-marketplace-6pxcc"
Mar 20 11:54:00 crc kubenswrapper[4860]: I0320 11:54:00.583824 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68089545-e05b-4352-b47d-37ad7ae7bd55-utilities\") pod \"redhat-marketplace-6pxcc\" (UID: \"68089545-e05b-4352-b47d-37ad7ae7bd55\") " pod="openshift-marketplace/redhat-marketplace-6pxcc"
Mar 20 11:54:00 crc kubenswrapper[4860]: I0320 11:54:00.583927 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68089545-e05b-4352-b47d-37ad7ae7bd55-catalog-content\") pod \"redhat-marketplace-6pxcc\" (UID: \"68089545-e05b-4352-b47d-37ad7ae7bd55\") " pod="openshift-marketplace/redhat-marketplace-6pxcc"
Mar 20 11:54:00 crc kubenswrapper[4860]: I0320 11:54:00.616435 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhtm5\" (UniqueName: \"kubernetes.io/projected/68089545-e05b-4352-b47d-37ad7ae7bd55-kube-api-access-hhtm5\") pod \"redhat-marketplace-6pxcc\" (UID: \"68089545-e05b-4352-b47d-37ad7ae7bd55\") " pod="openshift-marketplace/redhat-marketplace-6pxcc"
Mar 20 11:54:00 crc kubenswrapper[4860]: I0320 11:54:00.762263 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6pxcc"
Mar 20 11:54:01 crc kubenswrapper[4860]: I0320 11:54:01.025543 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566794-qpsdl"]
Mar 20 11:54:01 crc kubenswrapper[4860]: I0320 11:54:01.096908 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6pxcc"]
Mar 20 11:54:01 crc kubenswrapper[4860]: W0320 11:54:01.107744 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68089545_e05b_4352_b47d_37ad7ae7bd55.slice/crio-49d8678e90a394cc649a9253515242549bd94aa6fb2157d46a392d81d8ef553e WatchSource:0}: Error finding container 49d8678e90a394cc649a9253515242549bd94aa6fb2157d46a392d81d8ef553e: Status 404 returned error can't find the container with id 49d8678e90a394cc649a9253515242549bd94aa6fb2157d46a392d81d8ef553e
Mar 20 11:54:01 crc kubenswrapper[4860]: I0320 11:54:01.880185 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566794-qpsdl" event={"ID":"1babb09d-4647-48e8-bb5a-ea00aa1e0a89","Type":"ContainerStarted","Data":"75db89442dea1a5e7a6fab371ccd07d57c6d23c7253c6c905c2a63c9c5f30a92"}
Mar 20 11:54:01 crc kubenswrapper[4860]: I0320 11:54:01.884343 4860 generic.go:334] "Generic (PLEG): container finished" podID="68089545-e05b-4352-b47d-37ad7ae7bd55" containerID="b3bbe945735b107b64688b96a18e24f6c51fd781e6eccc610b43833b493bb15f" exitCode=0
Mar 20 11:54:01 crc kubenswrapper[4860]: I0320 11:54:01.884418 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6pxcc" event={"ID":"68089545-e05b-4352-b47d-37ad7ae7bd55","Type":"ContainerDied","Data":"b3bbe945735b107b64688b96a18e24f6c51fd781e6eccc610b43833b493bb15f"}
Mar 20 11:54:01 crc kubenswrapper[4860]: I0320 11:54:01.884462 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6pxcc" event={"ID":"68089545-e05b-4352-b47d-37ad7ae7bd55","Type":"ContainerStarted","Data":"49d8678e90a394cc649a9253515242549bd94aa6fb2157d46a392d81d8ef553e"}
Mar 20 11:54:02 crc kubenswrapper[4860]: I0320 11:54:02.816180 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fdtds"]
Mar 20 11:54:02 crc kubenswrapper[4860]: I0320 11:54:02.818370 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fdtds"
Mar 20 11:54:02 crc kubenswrapper[4860]: I0320 11:54:02.836288 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fdtds"]
Mar 20 11:54:02 crc kubenswrapper[4860]: I0320 11:54:02.908492 4860 generic.go:334] "Generic (PLEG): container finished" podID="1babb09d-4647-48e8-bb5a-ea00aa1e0a89" containerID="027bfeeca9ef5489ad1ac6e60f7135ea097246b42bf09aa60991a6ad1192d512" exitCode=0
Mar 20 11:54:02 crc kubenswrapper[4860]: I0320 11:54:02.908624 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566794-qpsdl" event={"ID":"1babb09d-4647-48e8-bb5a-ea00aa1e0a89","Type":"ContainerDied","Data":"027bfeeca9ef5489ad1ac6e60f7135ea097246b42bf09aa60991a6ad1192d512"}
Mar 20 11:54:02 crc kubenswrapper[4860]: I0320 11:54:02.913755 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6pxcc" event={"ID":"68089545-e05b-4352-b47d-37ad7ae7bd55","Type":"ContainerStarted","Data":"484dcfcf91015530ae85e0ef6e791b5847b29e3a4667cc1d3c55be76a269f483"}
Mar 20 11:54:02 crc kubenswrapper[4860]: I0320 11:54:02.926707 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/704d95d6-aca3-4174-b2ac-985b2bfbeb5d-utilities\") pod \"redhat-operators-fdtds\" (UID: \"704d95d6-aca3-4174-b2ac-985b2bfbeb5d\") " pod="openshift-marketplace/redhat-operators-fdtds"
Mar 20 11:54:02 crc kubenswrapper[4860]: I0320 11:54:02.927108 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q76sc\" (UniqueName: \"kubernetes.io/projected/704d95d6-aca3-4174-b2ac-985b2bfbeb5d-kube-api-access-q76sc\") pod \"redhat-operators-fdtds\" (UID: \"704d95d6-aca3-4174-b2ac-985b2bfbeb5d\") " pod="openshift-marketplace/redhat-operators-fdtds"
Mar 20 11:54:02 crc kubenswrapper[4860]: I0320 11:54:02.927286 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/704d95d6-aca3-4174-b2ac-985b2bfbeb5d-catalog-content\") pod \"redhat-operators-fdtds\" (UID: \"704d95d6-aca3-4174-b2ac-985b2bfbeb5d\") " pod="openshift-marketplace/redhat-operators-fdtds"
Mar 20 11:54:03 crc kubenswrapper[4860]: I0320 11:54:03.030444 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/704d95d6-aca3-4174-b2ac-985b2bfbeb5d-utilities\") pod \"redhat-operators-fdtds\" (UID: \"704d95d6-aca3-4174-b2ac-985b2bfbeb5d\") " pod="openshift-marketplace/redhat-operators-fdtds"
Mar 20 11:54:03 crc kubenswrapper[4860]: I0320 11:54:03.030549 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q76sc\" (UniqueName: \"kubernetes.io/projected/704d95d6-aca3-4174-b2ac-985b2bfbeb5d-kube-api-access-q76sc\") pod \"redhat-operators-fdtds\" (UID: \"704d95d6-aca3-4174-b2ac-985b2bfbeb5d\") " pod="openshift-marketplace/redhat-operators-fdtds"
Mar 20 11:54:03 crc kubenswrapper[4860]: I0320 11:54:03.030585 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/704d95d6-aca3-4174-b2ac-985b2bfbeb5d-catalog-content\") pod \"redhat-operators-fdtds\" (UID: \"704d95d6-aca3-4174-b2ac-985b2bfbeb5d\") " pod="openshift-marketplace/redhat-operators-fdtds"
Mar 20 11:54:03 crc kubenswrapper[4860]: I0320 11:54:03.031428 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/704d95d6-aca3-4174-b2ac-985b2bfbeb5d-catalog-content\") pod \"redhat-operators-fdtds\" (UID: \"704d95d6-aca3-4174-b2ac-985b2bfbeb5d\") " pod="openshift-marketplace/redhat-operators-fdtds"
Mar 20 11:54:03 crc kubenswrapper[4860]: I0320 11:54:03.031931 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/704d95d6-aca3-4174-b2ac-985b2bfbeb5d-utilities\") pod \"redhat-operators-fdtds\" (UID: \"704d95d6-aca3-4174-b2ac-985b2bfbeb5d\") " pod="openshift-marketplace/redhat-operators-fdtds"
Mar 20 11:54:03 crc kubenswrapper[4860]: I0320 11:54:03.066671 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q76sc\" (UniqueName: \"kubernetes.io/projected/704d95d6-aca3-4174-b2ac-985b2bfbeb5d-kube-api-access-q76sc\") pod \"redhat-operators-fdtds\" (UID: \"704d95d6-aca3-4174-b2ac-985b2bfbeb5d\") " pod="openshift-marketplace/redhat-operators-fdtds"
Mar 20 11:54:03 crc kubenswrapper[4860]: I0320 11:54:03.137157 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fdtds"
Mar 20 11:54:03 crc kubenswrapper[4860]: I0320 11:54:03.728520 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fdtds"]
Mar 20 11:54:03 crc kubenswrapper[4860]: I0320 11:54:03.925454 4860 generic.go:334] "Generic (PLEG): container finished" podID="68089545-e05b-4352-b47d-37ad7ae7bd55" containerID="484dcfcf91015530ae85e0ef6e791b5847b29e3a4667cc1d3c55be76a269f483" exitCode=0
Mar 20 11:54:03 crc kubenswrapper[4860]: I0320 11:54:03.925551 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6pxcc" event={"ID":"68089545-e05b-4352-b47d-37ad7ae7bd55","Type":"ContainerDied","Data":"484dcfcf91015530ae85e0ef6e791b5847b29e3a4667cc1d3c55be76a269f483"}
Mar 20 11:54:03 crc kubenswrapper[4860]: I0320 11:54:03.929180 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fdtds" event={"ID":"704d95d6-aca3-4174-b2ac-985b2bfbeb5d","Type":"ContainerStarted","Data":"330d366d0bd98890897cef9f455199916baf9480a0d51d16ee738892098bbac5"}
Mar 20 11:54:03 crc kubenswrapper[4860]: I0320 11:54:03.929249 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fdtds" event={"ID":"704d95d6-aca3-4174-b2ac-985b2bfbeb5d","Type":"ContainerStarted","Data":"a4f9dcd43db44e4ce7dee893eb45c95c802e80f34966961a8a158af1d1e14255"}
Mar 20 11:54:04 crc kubenswrapper[4860]: I0320 11:54:04.314690 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566794-qpsdl"
Mar 20 11:54:04 crc kubenswrapper[4860]: I0320 11:54:04.481523 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btg9p\" (UniqueName: \"kubernetes.io/projected/1babb09d-4647-48e8-bb5a-ea00aa1e0a89-kube-api-access-btg9p\") pod \"1babb09d-4647-48e8-bb5a-ea00aa1e0a89\" (UID: \"1babb09d-4647-48e8-bb5a-ea00aa1e0a89\") "
Mar 20 11:54:04 crc kubenswrapper[4860]: I0320 11:54:04.489476 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1babb09d-4647-48e8-bb5a-ea00aa1e0a89-kube-api-access-btg9p" (OuterVolumeSpecName: "kube-api-access-btg9p") pod "1babb09d-4647-48e8-bb5a-ea00aa1e0a89" (UID: "1babb09d-4647-48e8-bb5a-ea00aa1e0a89"). InnerVolumeSpecName "kube-api-access-btg9p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 11:54:04 crc kubenswrapper[4860]: I0320 11:54:04.583452 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-btg9p\" (UniqueName: \"kubernetes.io/projected/1babb09d-4647-48e8-bb5a-ea00aa1e0a89-kube-api-access-btg9p\") on node \"crc\" DevicePath \"\""
Mar 20 11:54:04 crc kubenswrapper[4860]: I0320 11:54:04.948061 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566794-qpsdl" event={"ID":"1babb09d-4647-48e8-bb5a-ea00aa1e0a89","Type":"ContainerDied","Data":"75db89442dea1a5e7a6fab371ccd07d57c6d23c7253c6c905c2a63c9c5f30a92"}
Mar 20 11:54:04 crc kubenswrapper[4860]: I0320 11:54:04.948630 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75db89442dea1a5e7a6fab371ccd07d57c6d23c7253c6c905c2a63c9c5f30a92"
Mar 20 11:54:04 crc kubenswrapper[4860]: I0320 11:54:04.948731 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566794-qpsdl"
Mar 20 11:54:04 crc kubenswrapper[4860]: I0320 11:54:04.959054 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6pxcc" event={"ID":"68089545-e05b-4352-b47d-37ad7ae7bd55","Type":"ContainerStarted","Data":"884fde30a6c7af9672f286b0c571a0af93825abe1de41d97152f174ee8dc4ad0"}
Mar 20 11:54:04 crc kubenswrapper[4860]: I0320 11:54:04.962018 4860 generic.go:334] "Generic (PLEG): container finished" podID="704d95d6-aca3-4174-b2ac-985b2bfbeb5d" containerID="330d366d0bd98890897cef9f455199916baf9480a0d51d16ee738892098bbac5" exitCode=0
Mar 20 11:54:04 crc kubenswrapper[4860]: I0320 11:54:04.962089 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fdtds" event={"ID":"704d95d6-aca3-4174-b2ac-985b2bfbeb5d","Type":"ContainerDied","Data":"330d366d0bd98890897cef9f455199916baf9480a0d51d16ee738892098bbac5"}
Mar 20 11:54:05 crc kubenswrapper[4860]: I0320 11:54:05.006960 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6pxcc" podStartSLOduration=2.214376319 podStartE2EDuration="5.006251899s" podCreationTimestamp="2026-03-20 11:54:00 +0000 UTC" firstStartedPulling="2026-03-20 11:54:01.887191797 +0000 UTC m=+3566.108552695" lastFinishedPulling="2026-03-20 11:54:04.679067377 +0000 UTC m=+3568.900428275" observedRunningTime="2026-03-20 11:54:05.002048845 +0000 UTC m=+3569.223409743" watchObservedRunningTime="2026-03-20 11:54:05.006251899 +0000 UTC m=+3569.227612807"
Mar 20 11:54:05 crc kubenswrapper[4860]: I0320 11:54:05.438693 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566788-dlb4m"]
Mar 20 11:54:05 crc kubenswrapper[4860]: I0320 11:54:05.440730 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566788-dlb4m"]
Mar 20 11:54:05 crc kubenswrapper[4860]: I0320 11:54:05.974367 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fdtds" event={"ID":"704d95d6-aca3-4174-b2ac-985b2bfbeb5d","Type":"ContainerStarted","Data":"a50d98eb7fcffb294ffbe16450e711b18f853e32d7c394b9bf91310063f9301d"}
Mar 20 11:54:06 crc kubenswrapper[4860]: I0320 11:54:06.985892 4860 generic.go:334] "Generic (PLEG): container finished" podID="704d95d6-aca3-4174-b2ac-985b2bfbeb5d" containerID="a50d98eb7fcffb294ffbe16450e711b18f853e32d7c394b9bf91310063f9301d" exitCode=0
Mar 20 11:54:06 crc kubenswrapper[4860]: I0320 11:54:06.985981 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fdtds" event={"ID":"704d95d6-aca3-4174-b2ac-985b2bfbeb5d","Type":"ContainerDied","Data":"a50d98eb7fcffb294ffbe16450e711b18f853e32d7c394b9bf91310063f9301d"}
Mar 20 11:54:07 crc kubenswrapper[4860]: I0320 11:54:07.425675 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f342006-66c4-4bc6-9577-1aa4db4b4210" path="/var/lib/kubelet/pods/0f342006-66c4-4bc6-9577-1aa4db4b4210/volumes"
Mar 20 11:54:07 crc kubenswrapper[4860]: I0320 11:54:07.999295 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fdtds" event={"ID":"704d95d6-aca3-4174-b2ac-985b2bfbeb5d","Type":"ContainerStarted","Data":"8aab8a407291816fc2e41a3ac046913ddc163a7bcd6ccf90737921590da1888e"}
Mar 20 11:54:08 crc kubenswrapper[4860]: I0320 11:54:08.021814 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fdtds" podStartSLOduration=3.5371499269999997 podStartE2EDuration="6.021785542s" podCreationTimestamp="2026-03-20 11:54:02 +0000 UTC" firstStartedPulling="2026-03-20 11:54:04.964129501 +0000 UTC m=+3569.185490399" lastFinishedPulling="2026-03-20 11:54:07.448765116 +0000 UTC m=+3571.670126014" observedRunningTime="2026-03-20 11:54:08.021137154 +0000 UTC m=+3572.242498052" watchObservedRunningTime="2026-03-20 11:54:08.021785542 +0000 UTC m=+3572.243146440"
Mar 20 11:54:10 crc kubenswrapper[4860]: I0320 11:54:10.762802 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6pxcc"
Mar 20 11:54:10 crc kubenswrapper[4860]: I0320 11:54:10.764890 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6pxcc"
Mar 20 11:54:10 crc kubenswrapper[4860]: I0320 11:54:10.808837 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6pxcc"
Mar 20 11:54:11 crc kubenswrapper[4860]: I0320 11:54:11.062640 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6pxcc"
Mar 20 11:54:13 crc kubenswrapper[4860]: I0320 11:54:13.138273 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fdtds"
Mar 20 11:54:13 crc kubenswrapper[4860]: I0320 11:54:13.138788 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fdtds"
Mar 20 11:54:14 crc kubenswrapper[4860]: I0320 11:54:14.193704 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fdtds" podUID="704d95d6-aca3-4174-b2ac-985b2bfbeb5d" containerName="registry-server" probeResult="failure" output=<
Mar 20 11:54:14 crc kubenswrapper[4860]: timeout: failed to connect service ":50051" within 1s
Mar 20 11:54:14 crc kubenswrapper[4860]: >
Mar 20 11:54:14 crc kubenswrapper[4860]: I0320 11:54:14.215102 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6pxcc"]
Mar 20 11:54:14 crc kubenswrapper[4860]: I0320 11:54:14.215529 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6pxcc" podUID="68089545-e05b-4352-b47d-37ad7ae7bd55" containerName="registry-server" containerID="cri-o://884fde30a6c7af9672f286b0c571a0af93825abe1de41d97152f174ee8dc4ad0" gracePeriod=2
Mar 20 11:54:14 crc kubenswrapper[4860]: I0320 11:54:14.640788 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6pxcc"
Mar 20 11:54:14 crc kubenswrapper[4860]: I0320 11:54:14.758602 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhtm5\" (UniqueName: \"kubernetes.io/projected/68089545-e05b-4352-b47d-37ad7ae7bd55-kube-api-access-hhtm5\") pod \"68089545-e05b-4352-b47d-37ad7ae7bd55\" (UID: \"68089545-e05b-4352-b47d-37ad7ae7bd55\") "
Mar 20 11:54:14 crc kubenswrapper[4860]: I0320 11:54:14.758776 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68089545-e05b-4352-b47d-37ad7ae7bd55-catalog-content\") pod \"68089545-e05b-4352-b47d-37ad7ae7bd55\" (UID: \"68089545-e05b-4352-b47d-37ad7ae7bd55\") "
Mar 20 11:54:14 crc kubenswrapper[4860]: I0320 11:54:14.758805 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68089545-e05b-4352-b47d-37ad7ae7bd55-utilities\") pod \"68089545-e05b-4352-b47d-37ad7ae7bd55\" (UID: \"68089545-e05b-4352-b47d-37ad7ae7bd55\") "
Mar 20 11:54:14 crc kubenswrapper[4860]: I0320 11:54:14.760096 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68089545-e05b-4352-b47d-37ad7ae7bd55-utilities" (OuterVolumeSpecName: "utilities") pod "68089545-e05b-4352-b47d-37ad7ae7bd55" (UID: "68089545-e05b-4352-b47d-37ad7ae7bd55"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 11:54:14 crc kubenswrapper[4860]: I0320 11:54:14.772997 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68089545-e05b-4352-b47d-37ad7ae7bd55-kube-api-access-hhtm5" (OuterVolumeSpecName: "kube-api-access-hhtm5") pod "68089545-e05b-4352-b47d-37ad7ae7bd55" (UID: "68089545-e05b-4352-b47d-37ad7ae7bd55"). InnerVolumeSpecName "kube-api-access-hhtm5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 11:54:14 crc kubenswrapper[4860]: I0320 11:54:14.794764 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68089545-e05b-4352-b47d-37ad7ae7bd55-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "68089545-e05b-4352-b47d-37ad7ae7bd55" (UID: "68089545-e05b-4352-b47d-37ad7ae7bd55"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 11:54:14 crc kubenswrapper[4860]: I0320 11:54:14.860643 4860 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68089545-e05b-4352-b47d-37ad7ae7bd55-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 11:54:14 crc kubenswrapper[4860]: I0320 11:54:14.860693 4860 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68089545-e05b-4352-b47d-37ad7ae7bd55-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 11:54:14 crc kubenswrapper[4860]: I0320 11:54:14.860705 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hhtm5\" (UniqueName: \"kubernetes.io/projected/68089545-e05b-4352-b47d-37ad7ae7bd55-kube-api-access-hhtm5\") on node \"crc\" DevicePath \"\""
Mar 20 11:54:15 crc kubenswrapper[4860]: I0320 11:54:15.058109 4860 generic.go:334] "Generic (PLEG): container finished" podID="68089545-e05b-4352-b47d-37ad7ae7bd55" containerID="884fde30a6c7af9672f286b0c571a0af93825abe1de41d97152f174ee8dc4ad0" exitCode=0
Mar 20 11:54:15 crc kubenswrapper[4860]: I0320 11:54:15.058185 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6pxcc" event={"ID":"68089545-e05b-4352-b47d-37ad7ae7bd55","Type":"ContainerDied","Data":"884fde30a6c7af9672f286b0c571a0af93825abe1de41d97152f174ee8dc4ad0"}
Mar 20 11:54:15 crc kubenswrapper[4860]: I0320 11:54:15.058209 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6pxcc"
Mar 20 11:54:15 crc kubenswrapper[4860]: I0320 11:54:15.058257 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6pxcc" event={"ID":"68089545-e05b-4352-b47d-37ad7ae7bd55","Type":"ContainerDied","Data":"49d8678e90a394cc649a9253515242549bd94aa6fb2157d46a392d81d8ef553e"}
Mar 20 11:54:15 crc kubenswrapper[4860]: I0320 11:54:15.058287 4860 scope.go:117] "RemoveContainer" containerID="884fde30a6c7af9672f286b0c571a0af93825abe1de41d97152f174ee8dc4ad0"
Mar 20 11:54:15 crc kubenswrapper[4860]: I0320 11:54:15.079952 4860 scope.go:117] "RemoveContainer" containerID="484dcfcf91015530ae85e0ef6e791b5847b29e3a4667cc1d3c55be76a269f483"
Mar 20 11:54:15 crc kubenswrapper[4860]: I0320 11:54:15.100545 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6pxcc"]
Mar 20 11:54:15 crc kubenswrapper[4860]: I0320 11:54:15.109849 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6pxcc"]
Mar 20 11:54:15 crc kubenswrapper[4860]: I0320 11:54:15.120729 4860 scope.go:117] "RemoveContainer" containerID="b3bbe945735b107b64688b96a18e24f6c51fd781e6eccc610b43833b493bb15f"
Mar 20 11:54:15 crc kubenswrapper[4860]: I0320 11:54:15.138950 4860 scope.go:117] "RemoveContainer" containerID="884fde30a6c7af9672f286b0c571a0af93825abe1de41d97152f174ee8dc4ad0"
Mar 20 11:54:15 crc kubenswrapper[4860]: E0320 11:54:15.139613 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"884fde30a6c7af9672f286b0c571a0af93825abe1de41d97152f174ee8dc4ad0\": container with ID starting with 884fde30a6c7af9672f286b0c571a0af93825abe1de41d97152f174ee8dc4ad0 not found: ID does not exist" containerID="884fde30a6c7af9672f286b0c571a0af93825abe1de41d97152f174ee8dc4ad0"
Mar 20 11:54:15 crc kubenswrapper[4860]: I0320 11:54:15.139775 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"884fde30a6c7af9672f286b0c571a0af93825abe1de41d97152f174ee8dc4ad0"} err="failed to get container status \"884fde30a6c7af9672f286b0c571a0af93825abe1de41d97152f174ee8dc4ad0\": rpc error: code = NotFound desc = could not find container \"884fde30a6c7af9672f286b0c571a0af93825abe1de41d97152f174ee8dc4ad0\": container with ID starting with 884fde30a6c7af9672f286b0c571a0af93825abe1de41d97152f174ee8dc4ad0 not found: ID does not exist"
Mar 20 11:54:15 crc kubenswrapper[4860]: I0320 11:54:15.139898 4860 scope.go:117] "RemoveContainer" containerID="484dcfcf91015530ae85e0ef6e791b5847b29e3a4667cc1d3c55be76a269f483"
Mar 20 11:54:15 crc kubenswrapper[4860]: E0320 11:54:15.140491 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"484dcfcf91015530ae85e0ef6e791b5847b29e3a4667cc1d3c55be76a269f483\": container with ID starting with 484dcfcf91015530ae85e0ef6e791b5847b29e3a4667cc1d3c55be76a269f483 not found: ID does not exist" containerID="484dcfcf91015530ae85e0ef6e791b5847b29e3a4667cc1d3c55be76a269f483"
Mar 20 11:54:15 crc kubenswrapper[4860]: I0320 11:54:15.140526 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"484dcfcf91015530ae85e0ef6e791b5847b29e3a4667cc1d3c55be76a269f483"} err="failed to get container status \"484dcfcf91015530ae85e0ef6e791b5847b29e3a4667cc1d3c55be76a269f483\": rpc error: code = NotFound desc = could not find container \"484dcfcf91015530ae85e0ef6e791b5847b29e3a4667cc1d3c55be76a269f483\": container with ID starting with 484dcfcf91015530ae85e0ef6e791b5847b29e3a4667cc1d3c55be76a269f483 not found: ID does not exist"
Mar 20 11:54:15 crc kubenswrapper[4860]: I0320 11:54:15.140542 4860 scope.go:117] "RemoveContainer" containerID="b3bbe945735b107b64688b96a18e24f6c51fd781e6eccc610b43833b493bb15f"
Mar 20 11:54:15 crc kubenswrapper[4860]: E0320 11:54:15.140820 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3bbe945735b107b64688b96a18e24f6c51fd781e6eccc610b43833b493bb15f\": container with ID starting with b3bbe945735b107b64688b96a18e24f6c51fd781e6eccc610b43833b493bb15f not found: ID does not exist" containerID="b3bbe945735b107b64688b96a18e24f6c51fd781e6eccc610b43833b493bb15f"
Mar 20 11:54:15 crc kubenswrapper[4860]: I0320 11:54:15.140864 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3bbe945735b107b64688b96a18e24f6c51fd781e6eccc610b43833b493bb15f"} err="failed to get container status \"b3bbe945735b107b64688b96a18e24f6c51fd781e6eccc610b43833b493bb15f\": rpc error: code = NotFound desc = could not find container \"b3bbe945735b107b64688b96a18e24f6c51fd781e6eccc610b43833b493bb15f\": container with ID starting with b3bbe945735b107b64688b96a18e24f6c51fd781e6eccc610b43833b493bb15f not found: ID does not exist"
Mar 20 11:54:15 crc kubenswrapper[4860]: I0320 11:54:15.424576 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68089545-e05b-4352-b47d-37ad7ae7bd55" path="/var/lib/kubelet/pods/68089545-e05b-4352-b47d-37ad7ae7bd55/volumes"
Mar 20 11:54:22 crc kubenswrapper[4860]: I0320 11:54:22.343749 4860 patch_prober.go:28] interesting pod/machine-config-daemon-kvdqp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 11:54:22 crc kubenswrapper[4860]: I0320 11:54:22.345324 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 11:54:23 crc kubenswrapper[4860]: I0320 11:54:23.196975 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fdtds"
Mar 20 11:54:23 crc kubenswrapper[4860]: I0320 11:54:23.244081 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fdtds"
Mar 20 11:54:23 crc kubenswrapper[4860]: I0320 11:54:23.436852 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fdtds"]
Mar 20 11:54:25 crc kubenswrapper[4860]: I0320 11:54:25.142360 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fdtds" podUID="704d95d6-aca3-4174-b2ac-985b2bfbeb5d" containerName="registry-server" containerID="cri-o://8aab8a407291816fc2e41a3ac046913ddc163a7bcd6ccf90737921590da1888e" gracePeriod=2
Mar 20 11:54:25 crc kubenswrapper[4860]: I0320 11:54:25.853428 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fdtds"
Mar 20 11:54:26 crc kubenswrapper[4860]: I0320 11:54:26.017016 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q76sc\" (UniqueName: \"kubernetes.io/projected/704d95d6-aca3-4174-b2ac-985b2bfbeb5d-kube-api-access-q76sc\") pod \"704d95d6-aca3-4174-b2ac-985b2bfbeb5d\" (UID: \"704d95d6-aca3-4174-b2ac-985b2bfbeb5d\") "
Mar 20 11:54:26 crc kubenswrapper[4860]: I0320 11:54:26.017149 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/704d95d6-aca3-4174-b2ac-985b2bfbeb5d-utilities\") pod \"704d95d6-aca3-4174-b2ac-985b2bfbeb5d\" (UID: \"704d95d6-aca3-4174-b2ac-985b2bfbeb5d\") "
Mar 20 11:54:26 crc kubenswrapper[4860]: I0320 11:54:26.018194 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/704d95d6-aca3-4174-b2ac-985b2bfbeb5d-utilities" (OuterVolumeSpecName: "utilities") pod "704d95d6-aca3-4174-b2ac-985b2bfbeb5d" (UID: "704d95d6-aca3-4174-b2ac-985b2bfbeb5d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 11:54:26 crc kubenswrapper[4860]: I0320 11:54:26.018354 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/704d95d6-aca3-4174-b2ac-985b2bfbeb5d-catalog-content\") pod \"704d95d6-aca3-4174-b2ac-985b2bfbeb5d\" (UID: \"704d95d6-aca3-4174-b2ac-985b2bfbeb5d\") "
Mar 20 11:54:26 crc kubenswrapper[4860]: I0320 11:54:26.018665 4860 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/704d95d6-aca3-4174-b2ac-985b2bfbeb5d-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 11:54:26 crc kubenswrapper[4860]: I0320 11:54:26.023727 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/704d95d6-aca3-4174-b2ac-985b2bfbeb5d-kube-api-access-q76sc" (OuterVolumeSpecName: "kube-api-access-q76sc") pod "704d95d6-aca3-4174-b2ac-985b2bfbeb5d" (UID: "704d95d6-aca3-4174-b2ac-985b2bfbeb5d"). InnerVolumeSpecName "kube-api-access-q76sc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 11:54:26 crc kubenswrapper[4860]: I0320 11:54:26.121005 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q76sc\" (UniqueName: \"kubernetes.io/projected/704d95d6-aca3-4174-b2ac-985b2bfbeb5d-kube-api-access-q76sc\") on node \"crc\" DevicePath \"\""
Mar 20 11:54:26 crc kubenswrapper[4860]: I0320 11:54:26.154726 4860 generic.go:334] "Generic (PLEG): container finished" podID="704d95d6-aca3-4174-b2ac-985b2bfbeb5d" containerID="8aab8a407291816fc2e41a3ac046913ddc163a7bcd6ccf90737921590da1888e" exitCode=0
Mar 20 11:54:26 crc kubenswrapper[4860]: I0320 11:54:26.154796 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fdtds" event={"ID":"704d95d6-aca3-4174-b2ac-985b2bfbeb5d","Type":"ContainerDied","Data":"8aab8a407291816fc2e41a3ac046913ddc163a7bcd6ccf90737921590da1888e"}
Mar 20 11:54:26 crc kubenswrapper[4860]: I0320 11:54:26.154821 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fdtds"
Mar 20 11:54:26 crc kubenswrapper[4860]: I0320 11:54:26.154847 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fdtds" event={"ID":"704d95d6-aca3-4174-b2ac-985b2bfbeb5d","Type":"ContainerDied","Data":"a4f9dcd43db44e4ce7dee893eb45c95c802e80f34966961a8a158af1d1e14255"}
Mar 20 11:54:26 crc kubenswrapper[4860]: I0320 11:54:26.154871 4860 scope.go:117] "RemoveContainer" containerID="8aab8a407291816fc2e41a3ac046913ddc163a7bcd6ccf90737921590da1888e"
Mar 20 11:54:26 crc kubenswrapper[4860]: I0320 11:54:26.163841 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/704d95d6-aca3-4174-b2ac-985b2bfbeb5d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "704d95d6-aca3-4174-b2ac-985b2bfbeb5d" (UID: "704d95d6-aca3-4174-b2ac-985b2bfbeb5d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 11:54:26 crc kubenswrapper[4860]: I0320 11:54:26.182054 4860 scope.go:117] "RemoveContainer" containerID="a50d98eb7fcffb294ffbe16450e711b18f853e32d7c394b9bf91310063f9301d"
Mar 20 11:54:26 crc kubenswrapper[4860]: I0320 11:54:26.204711 4860 scope.go:117] "RemoveContainer" containerID="330d366d0bd98890897cef9f455199916baf9480a0d51d16ee738892098bbac5"
Mar 20 11:54:26 crc kubenswrapper[4860]: I0320 11:54:26.222670 4860 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/704d95d6-aca3-4174-b2ac-985b2bfbeb5d-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 11:54:26 crc kubenswrapper[4860]: I0320 11:54:26.233008 4860 scope.go:117] "RemoveContainer" containerID="8aab8a407291816fc2e41a3ac046913ddc163a7bcd6ccf90737921590da1888e"
Mar 20 11:54:26 crc kubenswrapper[4860]: E0320 11:54:26.233660 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8aab8a407291816fc2e41a3ac046913ddc163a7bcd6ccf90737921590da1888e\": container with ID starting with 8aab8a407291816fc2e41a3ac046913ddc163a7bcd6ccf90737921590da1888e not found: ID does not exist" containerID="8aab8a407291816fc2e41a3ac046913ddc163a7bcd6ccf90737921590da1888e"
Mar 20 11:54:26 crc kubenswrapper[4860]: I0320 11:54:26.233710 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8aab8a407291816fc2e41a3ac046913ddc163a7bcd6ccf90737921590da1888e"} err="failed to get container status \"8aab8a407291816fc2e41a3ac046913ddc163a7bcd6ccf90737921590da1888e\": rpc error: code = NotFound desc = could not find container \"8aab8a407291816fc2e41a3ac046913ddc163a7bcd6ccf90737921590da1888e\": container with ID starting with 8aab8a407291816fc2e41a3ac046913ddc163a7bcd6ccf90737921590da1888e not found: ID does not exist"
Mar 20 11:54:26 crc kubenswrapper[4860]: I0320 11:54:26.233770 4860 
scope.go:117] "RemoveContainer" containerID="a50d98eb7fcffb294ffbe16450e711b18f853e32d7c394b9bf91310063f9301d" Mar 20 11:54:26 crc kubenswrapper[4860]: E0320 11:54:26.234437 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a50d98eb7fcffb294ffbe16450e711b18f853e32d7c394b9bf91310063f9301d\": container with ID starting with a50d98eb7fcffb294ffbe16450e711b18f853e32d7c394b9bf91310063f9301d not found: ID does not exist" containerID="a50d98eb7fcffb294ffbe16450e711b18f853e32d7c394b9bf91310063f9301d" Mar 20 11:54:26 crc kubenswrapper[4860]: I0320 11:54:26.234473 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a50d98eb7fcffb294ffbe16450e711b18f853e32d7c394b9bf91310063f9301d"} err="failed to get container status \"a50d98eb7fcffb294ffbe16450e711b18f853e32d7c394b9bf91310063f9301d\": rpc error: code = NotFound desc = could not find container \"a50d98eb7fcffb294ffbe16450e711b18f853e32d7c394b9bf91310063f9301d\": container with ID starting with a50d98eb7fcffb294ffbe16450e711b18f853e32d7c394b9bf91310063f9301d not found: ID does not exist" Mar 20 11:54:26 crc kubenswrapper[4860]: I0320 11:54:26.234506 4860 scope.go:117] "RemoveContainer" containerID="330d366d0bd98890897cef9f455199916baf9480a0d51d16ee738892098bbac5" Mar 20 11:54:26 crc kubenswrapper[4860]: E0320 11:54:26.234975 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"330d366d0bd98890897cef9f455199916baf9480a0d51d16ee738892098bbac5\": container with ID starting with 330d366d0bd98890897cef9f455199916baf9480a0d51d16ee738892098bbac5 not found: ID does not exist" containerID="330d366d0bd98890897cef9f455199916baf9480a0d51d16ee738892098bbac5" Mar 20 11:54:26 crc kubenswrapper[4860]: I0320 11:54:26.234997 4860 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"330d366d0bd98890897cef9f455199916baf9480a0d51d16ee738892098bbac5"} err="failed to get container status \"330d366d0bd98890897cef9f455199916baf9480a0d51d16ee738892098bbac5\": rpc error: code = NotFound desc = could not find container \"330d366d0bd98890897cef9f455199916baf9480a0d51d16ee738892098bbac5\": container with ID starting with 330d366d0bd98890897cef9f455199916baf9480a0d51d16ee738892098bbac5 not found: ID does not exist" Mar 20 11:54:26 crc kubenswrapper[4860]: I0320 11:54:26.498783 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fdtds"] Mar 20 11:54:26 crc kubenswrapper[4860]: I0320 11:54:26.505951 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fdtds"] Mar 20 11:54:27 crc kubenswrapper[4860]: I0320 11:54:27.440506 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="704d95d6-aca3-4174-b2ac-985b2bfbeb5d" path="/var/lib/kubelet/pods/704d95d6-aca3-4174-b2ac-985b2bfbeb5d/volumes" var/home/core/zuul-output/logs/crc-cloud-workdir-crc-all-logs.tar.gz0000644000175000000000000000005515157232617024456 0ustar coreroot  Om77'(var/home/core/zuul-output/logs/crc-cloud/0000755000175000000000000000000015157232620017365 5ustar corerootvar/home/core/zuul-output/artifacts/0000755000175000017500000000000015157223166016515 5ustar corecorevar/home/core/zuul-output/docs/0000755000175000017500000000000015157223166015465 5ustar corecore